Beyond Bits and Atoms – Week 6 – Reading Notes


Blikstein, P. (2015). Computationally Enhanced Toolkits for Children: Historical Review and a Framework for Future Design. Stanford University, USA.

The generations of microcontrollers

  • The first generation: Pioneers of physical computing (LEGO/Logo, Braitenberg Bricks, and Programmable Bricks)
  • The second generation: Conquering the world (Crickets, Programmable Bricks, and BASIC Stamp)
  • The third generation: Broadening participation and accessing new knowledge domains (GoGo Board, Phidgets, Wiring, and Arduino)
  • The fourth generation: New form factors, new architectures, and new industrial design (Pico Cricket, Lilypad, Topobo, Cubelets, LittleBits)
  • The fifth generation: Single-board computers (Raspberry Pi, PCDuino, BeagleBoard)


Selective exposure for usability: Embedded error correction

Selective exposure for power: Tangibility mapping

“The main construct proposed in this monograph (selective exposure) and its two subcategories (embedded error correction and tangibility mapping) could help understand the use of current products and give designers a framework to imagine new ones.” (Blikstein, 2015)

Horn, M. S. (2013, February). The role of cultural forms in tangible interaction design. In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction (pp. 117-124). ACM.


“In this paper I have proposed an approach to tangible interaction design that looks beyond physical analogies and universal sensorimotor experiences. Specifically, I have argued that designers can purposefully evoke cultural forms as a means to activate existing patterns of social activity along with associated cognitive, physical, and emotional resources. This approach to design was inspired by the notion of social and cultural funds of knowledge [8, 18] and by Saxe’s form-function shift framework [30, 31]. Using three examples I demonstrated what this might look like in action.”


Curriculum Construction – Week 6 – Reading Notes


Banks, J. (1993). The Canon Debate, Knowledge Construction, and Multicultural Education. Educational Researcher, 22(5), pp. 4-14.

  • Dominating groups
    • Western traditionalists
    • Multiculturalists
    • Afrocentrism
  • Polarized debate, primarily in popular press, no productive interactions
  • Positionality – started with feminist movement
    • “Positionality reveals the importance of identifying the positions and frames of reference from which scholars and writers present their data, interpretations, analyses, and instruction (Anzaldúa, 1990; Ellsworth, 1989).” (Banks, 1993, p. 5)
  • Five types of knowledge
    • Personal/cultural knowledge
    • Popular knowledge
    • Mainstream academic knowledge
    • Transformative academic knowledge
    • School knowledge
  • The rules of power
    • “Delpit (1988) has stated that African American students are often unfamiliar with school cultural knowledge regarding power relationships. They consequently experience academic and behavioral problems because of their failure to conform to established norms, rules, and expectations. She recommends that teachers help African American students learn the rules of power in the school culture by explicitly teaching them to the students.” (Banks, 1993, p.7)
  • From academia to the classroom – takes time
    • “Consequently, school knowledge is influenced most heavily by mainstream academic knowledge and popular knowledge. Transformative academic knowledge usually has little direct influence on school knowledge. It usually affects school knowledge in a significant way only after it has become a part of mainstream and popular knowledge.” (Banks, 1993, p.11)

Sleeter, C. (1996). Multicultural Education as Social Activism. Albany, New York: State University of New York Press. pp. 91- 115.

  • Multiculturalism as a form of dialogue and acceptance of several points of view
  • Curricula often attempt to induct minorities into the dominant culture
  • “Oppressors” claim that all differences have been ‘resolved’ in order to maintain the status quo
  • Move away from trying to integrate and toward discussing and understanding difference

Eisner, E. W. (1993). Forms of understanding and the future of educational research. Educational Researcher, 22(7), pp. 5-11.

  • Representations of meaning
    • “Representation, as I use the term, is not the mental representation discussed in cognitive science (Shepard, 1982, 1990) but, rather, the process of transforming the contents of consciousness into a public form so that they can be stabilized, inspected, edited, and shared with others.” (Eisner, 1993, p.6)
  • New forms for new understandings – but how to assess these multiple forms that go beyond text and numbers?
  • Must explore
    • “Working at the edge of incompetence takes courage.” (Eisner, 1993, p.10)

Gardner, H. (1999). The Disciplined Mind. New York: Penguin Books. pp. 186-201, 208-213.

  • Enhance understanding by:
    • Providing powerful points of entry
      • Narrative entry points
      • Numerical entry points
      • Logical entry points
      • Existential/foundational entry points
      • Aesthetic entry points
      • “Hands-on” points of entry
      • Interpersonal points of entry
    • Offering apt analogies
      • Powerful analogies and metaphors
    • Providing multiple representations of the central or core ideas of the topic
  • Issues
    • How does one orchestrate the three approaches to important ideas?
    • How does one spread this orientation to the rest of the curriculum – and what might the limitations be?
    • How does one assess the success of such an approach?
    • How might this approach be misunderstood?
    • In the end, what is the status of the true, the beautiful, and the good, and of their possible interconnections?
  • Possibilities and limits
    • Measures of success
    • Possible misunderstandings of the approach
    • Once more: the true, the beautiful, and the good

Curriculum Construction – Week 6 – Reading Reflection 2 Assignment


“Whether it comes after teaching, while teaching, or by teaching, we often think of assessment as something done to students, not with them.” (Coffey, 2003, p.76)

The word “assessment” often brings to mind the final grade on the report card, or standardized tests that simply rank or classify students. It should instead be thought of as an opportunity for learning and an integral part of classroom activities. In an evolved, mature, and structured teacher-student dynamic, students can create their own quiz or exam questions, engage in reflections on their peers’ presentations or projects, and even grade each other’s tests. The idea might sound radical, yet the benefits might outweigh the extra work and planning it takes to ‘flip’ assessment. Being able to understand what quality work is, analyze your own work, and receive feedback and act upon it is a valuable lifelong skill.

“When students play a key role in the assessment process they acquire the tools they need to take responsibility for their own learning.” (Coffey, 2003, p.77)

One of the main purposes of assessment is external accountability, but its best application might be to improve student motivation and curiosity for learning, and to improve the teacher’s efficacy. It is essential for students to understand the purpose of their own education and to feel responsible for it. Showing students how they will be assessed, and what enduring understandings a course will bring them, gives them a sense of purpose in their education. Not knowing why you should learn math or science renders the whole learning experience meaningless. This disconnect is minimized when the teacher starts by involving the students in creating the measures and activities that will demonstrate their understanding of what the course is all about.

“Through the students’ explicit participation in all aspects of assessment activity, they arrived at shared meaning of quality work. Teachers and students used assessment to construct the bigger picture of an area of study, concept, or subject matter area.” (Coffey, 2003, p.78)

Applying this in a classroom requires a significant change in teaching practice. It might feel that engaging students in everyday assessment practices takes precious time away from regular ‘content coverage’. Yet this very engagement creates, for students, connections between content and demonstrations of knowledge, and between their own work and what quality work looks like. It might even make the teacher’s work easier, in the sense that students create their own tests and even grade their own work. They also provide feedback to their peers during presentations, and in the process learn by engaging with the material. Buy-in from school administrators should also be easy, since traditional assessments can still be used within this process. Large-scale assessment is itself an interesting topic students should understand – why such tests exist and how to reduce the stress involved in taking them – but I digress.

So how might one apply this concept in practice? Coffey’s concept of “everyday assessment” fits well with Wiggins and McTighe’s (2005) Understanding by Design process. The main twist or difference would be that determining acceptable evidence of student understanding would not be done in isolation, but with the students. The teacher would obviously have to provide the course’s goals and essential questions, but would do so in a manner that helps students understand why it is important to their lives (present and future) and how they will know whether they actually learned the content.

“Despite initial resistance, as students learned assessment-related skills, demarcations between roles and responsibilities with respect to assessment blurred. They learned to take on responsibilities and many even appropriated ongoing assessment into their regular habits and repertoires.” (Coffey, 2003, p.86)

The process is not easy and it takes time, but it provides a sense of clarity for the teacher when planning a course or a lesson. It’s not about the content that has to be delivered; it’s about creating mechanisms that demonstrate students’ learning. It’s about reviving that child’s desire to show off to their parents what they have just accomplished. It’s about knowing what is expected of you and creating a relationship that is based upon growth.


Coffey, J. (2003). Involving Students in Assessment. In J. Atkin & J. Coffey (Eds.) Everyday Assessment in the Science Classroom. Arlington, VA: National Science Teachers Association. pp. 75-87.

Wiggins, G., & McTighe, J. (2005). Understanding By Design. (Expanded 2nd edition) Alexandria, VA: Association for Supervision and Curriculum Development. pp. 13-34, and 105-133.

Brazilian Education – Week 6 – Class Notes



Today we talked about how education is financed in Brazil and about the National Education Plan, in a session led by Bob Verhine (Universidade Federal da Bahia).

Basically, Brazil spends quite a bit on education in absolute terms, but per-student spending is ridiculously low.

The plan has 20 goals, which are for the most part unattainable, vague, and/or non-descriptive. The focus seems to be to please all sides, be neutral, and avoid controversy.

Many policies and laws established in our Constitution for example, have never been put into practice…

Here are some points about the challenges of increasing expenditures on Brazilian education.

Teacher PD – Week 6 – Class Notes



Jigsaw exercise

Science Readings – Group 2
Audience – Teachers

Conceptual framework

  • STeLLA Conceptual Framework
    • Student Thinking
    • Science Content Storyline
  • Deepen content knowledge + STeLLA framework
  • Theory of teacher learning
    • “Theory of teacher learning. The program design was guided by a situated cognition theory of teacher learning and a cognitive apprenticeship model of instruction that view learning as naturally tied to authentic activity, context, and culture (Brown, Collins, & Duguid, 1989; Lave, 1988).”
  • Theory of science teaching and learning
    • “Theory of science teaching and learning. The STeLLA program is based on a constructivist, conceptual change view of science teaching and learning that focuses on making student thinking visible so that instruction can be responsive to students’ emerging ideas, understandings, and confusions and thus support them in developing understandings of science concepts and phenomena that genuinely make sense to them (not just memorized words).”

“With regard to our primary research question about program effects, we observed that students whose teachers experienced content deepening integrated with analysis-of-practice in their professional development program (i.e., STeLLA) reached higher levels of science achievement than did students whose teachers received content deepening alone.”

The control group was referred to as a “comparison group”.

How did it influence:

  • Research question
    • Content embedded within practice
  • PD Design
    • Center around videos and artifacts
    • Inquiry based
    • Social constructivist where the group comes up with explanation
    • Link science content & pedagogy
  • Research Design

Most important to share?

  • How to do it

What doesn’t matter?

  • Statistics


PD for teacher


  • Grounded in the content of the PD – algebraic reasoning
  • Student thinking
  • Addressing content knowledge
  • Focus on one key idea
  • On site support


LDT Seminar – Week 5 – Milestones Assignment


Here’s what I have so far for the definition of my Master’s Project – a work in progress.

Learning Experience Designer (LXD)
Lucas Longo
v.0, Jan 14, 2016


The trend towards blended learning environments is irreversible, and an increasing number of higher education institutions are moving in that direction. The transition from a traditional classroom or lecture-hall model to an online environment is labor-intensive for professors. Aside from the learning curve of any LMS, new content must be created and organized: PDFs, images, videos, links, and animations, to list a few. The challenge is to make it easier for professors – who for the most part do not have formal pedagogical training or multimedia content creation skills – to publish their courses while adopting research-based best practices.

Learning Experience Designer (LXD) is a curriculum construction tool that adapts to your teaching context and learner needs. It also provides all the multimedia creation tools you might need to record and edit video, annotate images and PDFs, or create animations. It uses artificial intelligence to suggest course formats, pedagogical strategies, activities, and challenges, providing references to works others have already created and tested. The final result is a published course that can be accessed via a browser or a mobile app, where students can engage in forums and peer-to-peer coaching.

As a proof of concept, I propose to use an existing LMS (Canvas, Coursera, or EdX) as a base and add the proposed functions, content, and interactions onto its interface. These new features will then be presented to and evaluated by teachers who have experience with the LMS. The goals are to judge whether such features improve the experience of creating the course and whether the resulting course positively affects learning outcomes. I intend to focus on an introductory programming course: a subject matter I am familiar with, whose learning outcomes are more easily assessed, and for which a vast amount of content is already available online to support the course.




How might we scaffold “experts” to create engaging hybrid courses?

In 2009 I started a mobile app development school in Brazil targeting developers and designers who needed to acquire these hot new skills. For the first year or so I taught the iPhone app development course while looking for more teachers to meet the large demand and to create new courses. Pedagogically, I was going on instinct, using a very hands-on approach: explain the concept, model it, then do it yourself. It worked, and it was straightforward enough to explain to the new teachers.

The challenge came when I started hiring teachers for new courses. The curriculum had to be constructed and the course content created. This task proved daunting for developers who had never taught before. Even with my course material as a reference or model, teachers were slow to produce material, and it was usually of poor quality: slides with too many details or lacking explanations of key concepts.

Once I decided I wanted to start selling the courses online, the challenge became too big. Where do I start? How much video versus written material should I use? How will students ask questions? How will we manage all these students? What are the best practices? All questions that could be resolved by well-designed software that scaffolds the process of creating the curriculum and course content.



LXD will make the user more proficient in the art of sharing their knowledge, stimulating them to repeat the process and create new and better courses. Users will benefit from theory-grounded strategies that promote effective learning in online environments. The Virtual Student will lead the process by posing provocative questions and requesting that content, assessment, and reflection activities be inserted into the course progression. The backwards design methodology will be used along with best practices of teaching online.

The main learning outcome will be that online teaching requires a different set of approaches, content, media, interactive experiences, and assessment methods to be effective. The Virtual Student will serve as an instructor and coach for the user during the process – different from a standalone course on ‘how to teach online’. Teaching and learning will occur during the process of creating a course.

To demonstrate the effectiveness of LXD I propose to survey users pre- and post-utilization of the tool, with questions that will inform me of the following characteristics of the user:

Pre-utilization:

  • Digital literacy level
  • Previous knowledge/experience with pedagogy
  • Previous knowledge/experience with online teaching
  • Perception of online course effectiveness
  • Personal beliefs about the challenges of creating an online course
  • Confidence level for creating an online course

Post-utilization:

  • New pedagogical content acquired
  • New online teaching content acquired
  • Perception of online course effectiveness
  • Personal beliefs about the challenges of creating an online course
  • Confidence level for creating an online course
During the utilization of the tool I intend to collect the following data:

  • Webcam video recording
  • Screen video recording w/ mouse tracks and clicks
  • User will be asked to think-aloud throughout the process

I also intend to test LXD with users who have already created online courses and interview them to learn:

  • Their perception of how much LXD actually helped them in the process
  • What they would do differently if they were to redo their existing courses
  • Input and feedback on what worked, what didn’t, and suggestions

The results will be interpreted using the grounded theory qualitative research method: theories of how to improve the tool will emerge from coding the evidence and forming propositions. The conclusion will address issues such as the viability of the concept, its effectiveness, and suggestions for future improvement.


The approach to learning that informs my design is a combination of the Protégé Effect, project-based learning, and TPACK. The expert, teacher, or content creator is here called the “user”, insofar as they are the person interacting with LXD during the course creation process.

The Protégé Effect will be elicited through a virtual student who prompts the user to teach it by asking leading questions, making suggestions, and warning the user about excessive use of one style of teaching, as well as about missing content, reflection opportunities, or detailing of prior knowledge. The virtual student closes the gap between content ideation and the actual student’s experience. Through immediate feedback, the virtual student will prompt the user to think deeply about content choices and aid in deciding the learning progression that must be in place.

The project-based learning approach simply means that the user is engaged in a project while using LXD: the project is the course creation process itself, within which scaffolds are presented to the user. In addition, the approach embeds backwards design principles in the interactions the virtual student has with the user. The idea is that the heuristics and strategies invoked by the virtual student are guided by these approaches without necessarily making them explicit.

Finally, LXD aims to increase the user’s Technological Pedagogical Content Knowledge (TPACK) by offering simple media editing tools, pedagogical scaffolds, and content produced by other users that can be incorporated into the course creation process. LXD is itself a technological tool that increases the user’s TPACK by presenting the information, background knowledge, and content that support the user’s ideation and publication strategies.


Existing solutions (“competition”):

LXD is a construct that, for the purposes of this project, will build upon an existing LMS or CMS – let’s call it an LCMS for simplicity. The LCMS provides a base to start from. Creating an entirely new LCMS from scratch is infeasible and unnecessary for testing LXD’s effectiveness. I am currently analyzing which platform is best suited for this project. Here is the list of the ones I have shortlisted:

  • Moodle – open-source CMS
    • Positives
      • Total freedom to create
      • Mature platform with thousands of plugins
      • Large community to interact with
    • Negatives
      • Cumbersome to customize
      • Old looking base interface
      • Old HTML base – no use of HTML5 affordances
  • Coursera
    • Positives
      • Could work with existing content publishers on Stanford
    • Negatives
      • The focus is curating online courses done with any online tool (I think)
  • Udemy
    • Positives
      • Its content publishing tool is one of the most user-friendly I’ve seen
    • Negatives
      • Would have to negotiate with Udemy access to their platform’s source code
  • Udacity
    • Positive
      • Focused on tech courses – familiar to me
    • Negative
      • Have never seen their course publication tool
      • Would have to negotiate with Udacity access to their platform’s source code
  • EdX
    • Positive
      • Candace Thille might have contacts to get access to the company
    • Negative
      • Have never seen their course publication tool
      • Would have to negotiate with EdX access to their platform’s source code

At the moment, Udemy is looking like my favorite candidate.

I also intend to talk to VPTL at Stanford to understand the usual difficulties professors have in the process of creating their online courses.


LXD will be a web-based tool that overlays the existing LCMS with text, images, and video, triggered by analyzing the steps and content being published in the course. This is where artificial intelligence comes into play. Say the user has published a 30-minute video – LXD might suggest that the video be shorter. If the user publishes 50 pages of text with no images, LXD might suggest that images illustrate concepts more powerfully than text alone. LXD might prompt the user to insert a knowledge check or reflection activity once the user has published five pieces of content. The idea is to provoke the user to think about how the learner will be processing the content towards learning.
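These heuristics are simple enough to express as threshold rules before any real AI is involved. A minimal sketch of what such a rule-based suggestion engine might look like – the `ContentItem` type, the `suggest` function, and the exact thresholds are illustrative assumptions drawn from the examples above, not part of any existing LMS API:

```python
from dataclasses import dataclass

@dataclass
class ContentItem:
    kind: str            # "video", "text", "image", or "activity"
    minutes: int = 0     # video length, if kind == "video"
    pages: int = 0       # text length, if kind == "text"
    has_images: bool = False

def suggest(items):
    """Return the prompts a virtual student might raise while the user
    publishes course content, applying simple threshold heuristics."""
    tips = []
    since_activity = 0  # content pieces published since the last activity
    for item in items:
        if item.kind == "video" and item.minutes > 30:
            tips.append("Consider splitting this video into shorter segments.")
        if item.kind == "text" and item.pages >= 50 and not item.has_images:
            tips.append("Images often illustrate concepts more powerfully than text alone.")
        if item.kind == "activity":
            since_activity = 0
        else:
            since_activity += 1
            if since_activity == 5:
                tips.append("Insert a knowledge check or reflection activity here.")
    return tips
```

A later version could replace these hand-written thresholds with learned models, but the overlay interaction with the user would stay the same.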

The key features of LXD are:

  • Virtual Student
    • 3D character that talks to the user
    • Guides the user through the process of creating the content
    • Asks questions about the content and format of the course as it is created
  • Media Editing Suite
    • Video editor
    • Image editor
    • Text editor
  • Course Publication Tool
    • Create course structure and progression
    • Add media
    • Create assessments (quizzes, multiple choice, reflections, etc.)
  • Curated Content
    • Access to similar courses to get examples
    • Ability to link to external material for students’ reference
  • Coaching
    • Peer-to-peer help to go through courses

This is definitely ambitious for the time and resources I have for this project. The biggest challenge, other than the sheer volume of features, is the Virtual Student. I would have to partner with someone who has experience with, and access to, such technology in order to create a prototype.


Benefits: How will you show that what you’re doing is 1) working (usability studies) and 2) helping learners learn (learning assessment)?  What kinds of evidence do you plan to collect (summarize here; put survey items, think-aloud protocol, transfer problems, etc. in an appendix)? Will you use questions/frameworks from published studies, or create your own? What target learners will you interview/observe, where and under what circumstances? How do you expect to analyze and interpret your findings?


What were the major ideas in this project? What are you most looking forward to?


(length as needed)

This proposal should use existing scholarship to justify, explain, and extend what you propose to do. In addition to being mentioned in the text, the list of sources cited in this proposal tells the reader how to find each original reference, in case they want to know more about what that author said. Please use one standard format, such as APA:



(1 page max.)


Milestones and deliverables

When do you need to do what, in order to finish on time? Example:

Winter quarter Observe target learners

Develop ideas

Write proposal

March 20, 2015 Proposal draft submitted to advisor
Date Participants for user testing and learning assessment arranged
Date Low-res learning assessments complete
Date Low-res prototype studies complete
Date Round 2 learning assessments complete
Date Round 2 prototype design complete
Date Final user testing and learning assessment complete
July 20, 2015 Project logo and video submitted
July 31, 2015 EXPO presentation, demo
August 6, 2015 Draft report submitted
August 13, 2015 Signed Master’s Project Form submitted

Time needed to implement project

Include your time and time of others. Okay to barter and trade skills with each other. You can learn new skills, but include the time it takes.



(1 page max.)


Funds needed to implement project

If you had a small budget to spend, what would you want to use it for? Think about thank-you gifts for testers, consultants, software, supplies. Be creative in thinking about how to leverage limited funds.


Every project that submits a budget here will be given a stipend of up to $200 per student to cover those needs. Budgets should be specific enough to show where the funds go, but need not itemize every expense. More funds may be available (apply and wait for approval before spending the funds!).


Item Approximate Cost



(1 page max.)


Collaboration (For Team Projects)

Explanation of how you intend to work with others on the team. How will the work and responsibilities be shared? How will individuals’ contributions be incorporated into the group product?



List of cooperating contributors to your project. These are people outside the project team whom you expect to consult with or who will provide support for the project. What tasks will you need help with (e.g., coding, graphics, connecting with target learners)?  In a real-world proposal they would write letters of support. Here, just list names & their contributions.


For individual projects, how will you set up your environment to give you the feedback and support you need along the way?



(length as needed)


Anything else you think you need to share with your reader. Do not assume appendices will be read; these are reference materials that provide the opportunity for the reader to go deeper should she or he so desire. Summarize the message or insights gleaned from these materials in the text of your proposal.  Example: annotated list of competing products.


Brazilian Education – Week 6 – Reading Notes



  • Article analyzing the laws related to public financing of education – the bill for the National Education Plan (PNE)
  • Extreme imbalance between resources for higher education vs. basic and elementary education
  • Extremely complicated structure in terms of funding-transfer rules
  • Overall education spending is low relative to the OECD
  • Per-student spending is MUCH lower relative to the OECD


Menezes, N. (2014). O Plano Nacional de Educação Publicado. Valor Econômico.

  • “The PNE was conceived by corporatist movements as a way to transfer more of society’s resources to themselves, without any demand for greater efficiency in how those resources are applied.”
  • “Simply increasing spending increases the mobilization of society’s organized movements to capture those resources.”
  • “One of the only pieces of good news in the PNE was the strategy included by the federal Senate, which establishes policies to reward schools that improve their performance on the Ideb, ‘so as to value the merit of teachers, principals, and staff’. It was included despite the resistance of the corporatist movements, which oppose meritocracy.”

Menezes, N. (2012). Mais gastos com educação? Valor Econômico.

  • “First of all, it should be clear that the PNE is merely a letter of intent. Nothing guarantees that its goals will actually be met.”
  • “Today, higher education absorbs 15% of public spending on education but has only 3% of all students.”
  • “in no country in the world is this spending discrepancy between basic and higher education so large.”
  • “Finally, it is worth stressing that increased spending does not mean improved educational quality. Several studies, including by the OECD, show this unequivocally. Without improving teacher training and the selection of principals, and without dismissing the worst teachers while they are still on probation, nothing will change, even if we spent 10% of GDP on education. There would only be a larger transfer of resources from society to teachers, with no improvement in student learning.”

Nóbrega, M. F. (2014). Escolha fatal: 10% do PIB para a educação. VEJA.

  • “Education does not need more money; it needs a revolution in management and in how teachers are paid, in order to improve its quality. The law will contribute little or nothing to that. It tends to be a ruinous choice.”

Almeida, M., & Pessoa, M. L. S. (2015). Desequilíbrio econômico é estrutural e exige correções mais duras. Folha de S. Paulo.

  • “There was certainly a serious loss of control over public spending starting in 2009. Beyond the short-term problems, however, there is a structural imbalance. Since 1991, public spending has grown at a higher rate than national income.”
  • “Over these 23 years, the public sector has appropriated 45% of the growth in national income to finance its spending, including income-transfer programs and other public policies.”
  • “Since each elderly person costs roughly twice as much as a child in school, the picture becomes even more dramatic. Weighted by the cost of each group, demographic dynamics have been driving public spending up for more than a decade.”
  • “Brazil’s serious fiscal problem reflects the unrestrained granting of benefits incompatible with national income. We have promised more than we have, postponing the confrontation of existing constraints. We are leaving the bills for the next generations to pay, but the future has the inconvenient habit of becoming the present.”

Congresso Nacional, Plano Nacional de Educação

  • A plan with 10 goals is difficult to fulfill
  • No penalties are foreseen for failing to meet the goals, nor any rewards
  • Only 2 (yes, just two) national education conferences are planned through 2020!!!

Teacher PD – Week 6 – Reading Notes


Little, J.W. (2004). ‘Looking at student work’ in the United States: a case of competing impulses in professional development. In C. Day & J. Sachs (Eds.) International Handbook on the Continuing Professional Development of Teachers (pp. 94-118). UK: Open University Press.

  • Looking at student work
    • Root teacher learning in and from practice
      • Deep understanding of how children learn
    • External control of teaching and teacher education
      • Standards
      • Controlling practice
      • Exercising sanctions
    • More attention to
      • School reform
      • Public accountability
  • Contradictory purposes
    • Stimulating and supporting teacher learning
    • Instructional decision making
    • Bolstering teacher community
    • Advancing whole-school reform
    • Satisfying demands for public accountability
  • Learning in and from practice
    • Not as many as we would wish for
  • Escalating accountability pressures
  • Looking at student work: profiles of purpose and practice
    • Principal rationale for looking at student work
    • Programmatic choices
    • Purposes
      • Deepen teacher knowledge
      • Strengthen teachers’ instructional practice in specific subject domains
      • Collective capacity for improvement in teaching and learning at the school level
      • Review of student work in the service of standards implementation and external accountability
  • Student work as a resource for deepening teacher knowledge
  • Student work as a catalyst for professional community and school reform
  • Student work as an instrument of external accountability
  • Multiple purposes – complementary or competing?
    • Tension
      • Teacher-defined inquiry and compliance with external standards
      • Research-based model and honoring teachers’ own interests and expertise
  • Teacher PD has many purposes, but hard to do them all in one
    • Depth of understanding in particular subject domains
    • Professional norms of mutual support and critique
    • Expectations for both internal and external accountability regarding students’ opportunity to learn
  • Teacher PD must account for local resources and knowledge to promote teacher community at the school site
  • Audit society
    • Leaves no room for experimentation
    • Teachers have little opportunity to reflect
    • No support from cohort
  • Contributions and limitations of research
    • ‘Value added’: contribution to teacher knowledge and practice
      • Triangle studies
        • Teacher development
        • Classroom practice
        • Student learning
      • Instructional triangle
        • Teacher
        • Student
        • Curriculum
    • Moderation
      • Make sure scores are being interpreted in the same way by all
    • Limited scope of research, expanding the scope of practice
      • “The result: a growing arena of practice that remains weakly positioned to capitalize on research, and research weakly attentive to expanding contexts of practice.” (Little, 2004)

Little, J.W., Gearhart, M., Curry, M., & Kafka, J. (2003). Looking at student work for teacher learning, teacher community, and school reform. Phi Delta Kappan, 85, 185-192.

  • Looking at student work usually occurs in isolation
  • Common elements of practice
    • Bringing teachers together to focus on student learning and teaching practice
    • Getting student work on the table and into the conversation
    • Structuring the conversation
  • What seems to work
    • Flexible, creative use of tools for local purposes
    • Ability to exploit subject expertise and examine subject issues
    • A balance between comfort and challenge
    • Facilitation to build a group and deepen a conversation
  • Three Dilemmas in making the most of looking at student work
    • Concern for personal comfort and collegial relationships
    • Scarce time, many interests
    • Uncertainty about what to highlight in “looking at student work”

Science Readings

Roth, K.J., Taylor, J.A., Wilson,C. D. & Landes, N.M. (April, 2013). Scale-up study of a videocase-based lesson analysis PD program: Teacher and student science content learning. Paper presented at the 2013 NARST Annual International Conference, Puerto Rico.

  • The Problem: Science Content Knowledge for Elementary Teachers
    • “Content knowledge is insufficient to identify and address children’s misunderstandings (Roth, Anderson, & Smith, 1987); in fact, teachers sometimes hold the same misconceptions as their students.” (Roth, Taylor, Wilson, & Landes,  2013).
    • Science Teachers Learning from Lesson Analysis (STeLLA)
      • Look at videos of practice
      • Student Thinking Lens
        • Strategies to reveal, support, and challenge student thinking
          • Ask questions to elicit student ideas and predictions
          • Ask questions to probe student ideas and predictions
          • Ask questions to challenge student thinking
          • Engage students in interpreting and reasoning about data and observations
          • Engage students in using and applying new science ideas in a variety of ways and contexts
          • Engage students in making connections by synthesizing and summarizing key science ideas
          • Engage students in communicating in scientific ways
      • Science Content Storyline Lens
        • Identify one main learning goal
        • Set the purpose with a focus question and/or goal statement
        • Select activities that are matched to the learning goal
        • Select content representations matched to the learning goal and engage students in their use
        • Sequence key science ideas and activities appropriately
        • Make explicit links between science ideas and activities
        • Link science ideas to other science ideas
        • Highlight key science ideas and focus question throughout
        • Summarize key science ideas

Roth, K. et al. (in press). The Effect of an Analysis-of-Practice, Videocase-Based, Teacher Professional Development Program on Elementary Students’ Science Achievement. Journal of Research on Educational Effectiveness.

  • Few rigorous studies about the relation of PD and student outcomes
    • “…few studies have tested causal relationships between teacher PD programs and student outcomes (Roth et al., 2011; Yoon, Duncan, Lee, Scarloss, & Shapley, 2007; Sleeter, 2014) and even fewer have used rigorous research designs.” (Roth, 2016, p.1)
  • Elementary teachers have little science content knowledge
    • “These problems are especially prevalent for elementary teachers who have little training in science-specific pedagogy or in the science disciplines they are expected to teach (Dorph et al., 2007, 2011; Fulp, 2002; Smith & Neale, 1989; Stoddart, Connell, Stofflett, & Peck, 1993).” (Roth, 2016, p.2)
  • Growing consensus that professional development should:
    • Engage teachers actively in collaborative analyses of their practice;
    • Treat content as central and intertwined with pedagogical issues;
    • Enable teachers to see these issues as embedded in real classroom contexts;
    • Focus on the content and curriculum teachers are teaching;
    • Be guided by an articulated model of teacher learning that specifies what knowledge and skills teachers will gain, what activities will lead to this learning and how this new knowledge and skills will appear in their teaching practices (Ball & Cohen, 1999; Darling-Hammond & Sykes, 1999; Desimone, 2009; Elmore, 2002; Garet et al., 2001; Guskey & Yoon, 2009; Hawley & Valli, 2006).
  • STeLLA’s shortcomings
    • Internal validity – control group was the same group of teachers one year before
    • External validity – samples drawn from urban schools in a single geographic region
    • Scalability – program developers delivered the PD – how would a PD facilitator have implemented it?
  • STeLLA suggested improvements
    • Random assignment of schools in the study to the program or a comparison condition;
    • The inclusion of a diverse sample of schools in the study;
    • A specified comparison condition that matched the treatment condition in duration, intensity, and contact hour
    • A treatment delivered entirely by PD providers who were not developers of STeLLA.

Teacher PD – Week 5 Reaction: The Role of Video in Teacher Learning



Read the responses your colleagues wrote about the reading and react to them.

(see responses below)

The reading responders did a thorough job of relating Gaudin and Chaliès’s (2015) framework to van Es and Sherin’s (2009) “video club” and to Gröschner, Kiemer and colleagues’ Dialogic Video Cycle (DVC). While some focused on the methodology of the studies, others raised interesting questions about the research papers and the information they offered. Beyond their descriptive nature, some common themes emerged from the responses.

All reviewers, for both the video club and the DVC, seem to agree that the programs adhere to Gaudin and Chaliès’s framework and were successful in achieving their goals. In both cases, the effect of watching video in PD seems to be positive, especially on teachers’ motivation and, consequently, on students’ interest in the discipline. Teachers’ metacognitive abilities were also positively affected in both cases. The teachers’ reactions to the PD were likewise positive, giving them a sense of competency and self-efficacy. In both cases teachers watched videos of their own practice and were scaffolded toward “selective attention” and “learning to notice”.

Another common thread was that the responders all felt there was room for more information in the papers. The process of selecting and editing the videos that best fit the PD experience was not fully delineated. For one responder the video selection process was clear, yet only in terms of its mechanics and procedures; little mention is made of the content choices involved in editing the videos. The responders also wanted more information about the facilitator’s role and the methodologies applied during the workshops themselves. Focus was placed on the video aspect of the PD, diminishing the level of detail presented about the interactions between the facilitator and the teachers.

Finally, some responders mentioned the desire for more information about how these video-based PDs compare to other, non-video-based PDs the teachers had already experienced. In general we can see that teachers are satisfied with the results of this kind of PD, but no information is given about what specifically they perceived as different and better about this methodology. One glimpse into this comparison comes from a teacher in the control group of the DVC study, who said they would have liked more direct feedback on their own teaching. The teachers in the treatment group did not report this need, having watched videos of their own practice and reflected upon them.

I personally found this exercise to be of paramount importance for gaining new lenses on the papers we read. It is fascinating how each person notices, deems relevant, and draws different information from the same source material. Clearly, prior knowledge, experiences, and context transform the lenses through which each of us sees the world. Being exposed to these different lenses enhances and expands our own understanding. In addition, reacting to these responses engages us even further with the topic and helps us see patterns we might not have noticed before. In this sense, I believe that video-based PD is an exemplary method for achieving the desired learning objectives with teachers. Videos present a concrete source of information and basis for reflection that is powerful, objective, direct, and close to one’s own practice. Of course, well-guided facilitation of the reflection process is required, as in any learning experience, yet I believe that once a teacher becomes aware of how to analyze videos, self-guided learning may happen voluntarily.

Response 1/5

Gaudin and Chaliès (2015) conceptualize the process of video viewing in professional development (PD) in the following four broad categories:

    1. Teachers’ activity as they view a classroom video (e.g. view video with selective attention; and view video and knowledge based reasoning).
    2. Objectives of video viewing in the PD context (e.g. build knowledge on “how to interpret and reflect”; build knowledge on “what to do”; hybrid approach; and objectives based on learning goals).
    3. Types of videos viewed (e.g. unknown teacher activity, peer activity, own practice, selecting and organizing videos in line with learning goals and contexts).
    4. Effects of video viewing on teacher education and professional development (e.g. teacher motivation, cognition, classroom practice, recommendations for effective video viewing).

In the context of Gaudin and Chaliès’ (2015) work, van Es and Sherin’s (2009) “video club” is a case of viewing video with selective attention, with a primary focus on “learning to notice” student ideas in elementary math classrooms. This one-year video club PD program drew on both the participants’ own practice and the practice of their peers, as the PD / research team filmed the participants’ classroom teaching and built experiences around those recordings in service of this teacher learning goal. The research team set out to examine how this type of video club would influence teachers’ thinking and practice, and documented changes in both over the course of the project. They were able to capture how teachers “made space for students’ thinking, … more frequently probed students’ thinking, and … took on the stance of the learner in the context of teaching” (p. 169).

As I read the van Es and Sherin (2009) work, I wanted to know more about how the facilitators designed the content and processes of the PD in the context of the elementary math knowledge, processes, and dispositions expected in each of these classrooms. There was little mention of how the videos were edited or of the types of questions used to facilitate teachers’ journeys toward the learning goals—which themselves are on the vague side. Noticing student ideas and providing opportunities for students to express them is certainly a step toward teachers understanding how students are approaching the mathematics, but it is unclear whether, through this video club participation, teachers were developing knowledge of how to move students along the learning continuum. I think about Schoenfeld and Floden’s “Teaching for Robust Understanding in Mathematics” dimensions and am searching for greater clarity as to how van Es and Sherin (2009) think about how teachers consider the mathematics, the cognitive demand, access to mathematical content, students’ agency, authority, and identity, and the use of assessment practices in shaping their practice and how students learn math. I’m also curious how this particular PD experience coheres or conflicts with other PD experiences these teachers have taken part in, and how that knowledge shapes their beliefs and actions over the course of the project period.

Response 2/5

In the following paragraphs I offer an analysis of Gröschner, Kiemer and colleagues’ Dialogic Video-Cycle (DVC) through the lens of Gaudin et al.’s four-principle conceptualization of how video viewing is used in teacher PD. In each area, we can see close alignment between the DVC model and what Gaudin et al. point to in their research analysis as best PD practices regarding video use.

The nature of teachers’ activity as they view a classroom video

In their video viewing conceptualization, Gaudin et al. (2015) point to selective attention as a key component of productive teacher video viewing activity. The video viewing structure in Gröschner, Kiemer and colleagues’ DVC model very much supported selective attention by designating a specific focus during each analysis session. In the first video analysis and reflection session (workshop 2 of each DVC cycle), for example, the facilitator selected clips and posed questions that guided the participants to focus on the ways in which the teacher in the video activated student engagement. In the second analysis and reflection session (workshop 3 of each DVC cycle), facilitators used video clips from the same lessons, but this time guided the participants’ attention to the ways in which the teacher scaffolded student ideas. This guiding of attention is very much in keeping with Gaudin et al.’s description of selective attention during video viewing.

Gaudin et al. (2015) explain that another important aspect of video viewing during PD is knowledge-based reasoning. The extent to which the teachers in the DVC study exercised knowledge based reasoning during their video viewing is less clear than the selective attention component due to the authors’ somewhat limited description of the actual analysis conversations themselves. We can, however, infer from the few example questions, and the authors’ mention of teachers posing solutions and alternatives, that teachers engaged in description, interpretation, and prediction while watching and discussing the videos. These are in keeping with Gaudin et al.’s description of “first level” reasoning. The structure of the DVCs also suggests that teachers had the opportunity to engage in “second level” reasoning (comparing visualized events with previous events). During the first cycle, they would have had the opportunity to connect and compare what they saw in the video with their own past practice. During the second DVC, the teachers would have had the, perhaps more explicit, opportunity to compare what they observed in the second videos with what they saw in the first videos. Whether, however, the facilitators deliberately capitalized on these opportunities for comparison remains unknown.

Objectives of video viewing in teacher education and professional development.

Gaudin et al. (2015) point to three major objectives for using videos during teacher PD. One is to model how to implement a practice (for example, were a video used in session 1 of the DVC process it quite likely would have been in this camp). A second objective is to teach participants how to interpret and reflect on practice. And the third is a hybrid of the two. In that they were used in more of a problem solving capacity – as examples, rather than exemplars – the videos used in the DVC reflect the objective of learning how to interpret and reflect. If, however, we consider the overarching goals of the DVC, along with the two-cycle structure, the objective of watching the videos becomes more of a hybrid. The teachers were asked to interpret and reflect on what occurred in the videos in the service of refining their practice for the second cycle (and beyond). Though no videos were specifically chosen as exemplars, it is conceivable that exemplary practice might have surfaced in the variety of clips observed. This suggests the possibility that the same videos could have been used to build capacity for best practice (the “normative” objective), as well as reflecting and interpreting (“developmentalist” objective) throughout the DVC process.

The nature of classroom videos viewed in teacher education and professional development  – and –
The effects of video viewing on teacher education and professional development

Gaudin et al. (2015) describe three types of videos that can be made available for viewing: videos that feature unknown teachers, peers, or one’s own activity. While they point to research that explores the advantages and disadvantages of each type, they emphasize that watching peer and self videos may be the most productive in that they encourage teachers to “ ‘know and recognize’ themselves” (Leblanc, 2012 as cited in Gaudin et al., 2015, p.51) and “‘move toward’ new and more satisfactory ways of teaching” (Gaudin et al. 2015, p.51). In the DVCs, the teachers watched videos of themselves and their peers conducting specific lessons and were guided to identify and reflect upon the effectiveness of the observed strategies for classroom discourse. The results of their approach mirror the affordances that Gaudin et al. describe, particularly in the area of teacher motivation. In Gröschner et al.’s final round table discussion, for example, teachers who had enrolled in the more traditional PD, in which videos were not viewed, expressed a desire for more direct feedback on their own teaching. In contrast, teachers who had participated in the DVC were satisfied with this element of their experience. In their final analysis, Gröschner et al. found teachers in the DVC group had stronger feelings of competence and satisfaction than those in the control group, which aligns with what Gaudin et al. describe in their analysis of similar research on this topic.

Response 3/5

Gaudin and Chaliès discuss four aspects of video viewing as a strategy for teacher professional development: “the nature of teachers’ activity as they view classroom videos,” “the objectives of video viewing in teacher education and professional development,” “the type of video viewed in teacher education and professional development,” and “the effects of video viewing on teacher education and professional development.” Here we want to consider the study by Kiemer, Gröschner, and colleagues of a PD intervention constructed around video viewing through these four lenses.

With respect to the nature of teachers’ activity, Gaudin and Chaliès look for active, rather than passive, engagement with the video. In particular, they prioritize evidence of selective attention and knowledge-based reasoning. Kiemer and Gröschner do not provide much information about the interactions that took place in the course of their workshops, focusing instead on the evolution of teachers’ answers to questions on the pre-, mid-, and post-questionnaires that surrounded the workshops themselves, as well as on the effects of the PD after its conclusion. They do, however, talk about wanting to engage the teachers in the same ways that they hope the teachers will go on to engage their students, so they have an activity designed to “activate students verbally and to clarify discourse rights” and one to “scaffold students’ ideas.” It is unclear what the content of discussions during those activities was, but from the later feedback sections, it seems that teachers felt they were being given “tips and suggestions about things you can change quickly,” which sounds like less of a constructivist/cognitive dissonance approach (which might focus first on selective attention), and more of a knowledge-based reasoning focus (looking at what the teacher does in the video to reason through, and get feedback on, what he or she could do to be more effective).

The objectives of video viewing in the Kiemer and Gröschner study are more clear. They want to help teachers build productive classroom discourse, through open-ended questions and feedback, in order to promote student interest, and thus motivation and learning outcomes. This goal seems to fall into Gaudin and Chaliès’s “normative” bucket, helping teachers to reflect on and develop their practice not with the intent of promoting ongoing self-directed reflection, but rather with a focus on leading teachers to come away with intent and strategies for leading student discussions more “correctly.” On the other hand, they do look at teachers’ perceived autonomy, suggesting an interest in their self-guided learning, yet their desired outcomes are all stated in terms of changes in teacher practice and in student outcomes. Additionally, the videos are all certainly “examples, not exemplars” so they are not shown as “what to do” videos, but are nonetheless used as a jumping off point for discussions of “what to do.”

Next, the nature of the classroom videos is again quite clear. Kiemer and Gröschner use videos of the teacher participants themselves, so, presumably, the teachers see videos of themselves as a main focus but also video of their peers who are participating in the same workshops. The workshops provide the community of support recommended for viewing videos of one’s own teaching, as well as the atmosphere of productive discourse that is scaffolded by the facilitator. The facilitator also pre-selects the clips from the videos to be watched in the workshop, reflecting the need, described by Gaudin and Chaliès, for more preparation and scaffolding than when watching other teachers. As all of the participants are mid-career teachers, rather than student teachers, using videos of the teachers themselves also fits into Gaudin and Chaliès’s “continuum of teacher professionalization,” which suggests that they are ready for such introspection even while earlier-career teachers might not be.

Finally, Gaudin and Chaliès see common effects of video viewing as enhancing teacher motivation and teachers’ selective attention, and they make particular note of the fact that “little empirical evidence has been presented on how video use benefits actual classroom practice.” Yet Kiemer and Gröschner’s second article specifically explores the effect of the PD intervention on teacher practices and student interest and motivation, which they also acknowledge as being unique among research papers in this field. For the most part, they do find positive outcomes relative to their objectives. Teachers used feedback more effectively to promote student discourse in their classes, and students were found to have more interest, as well as a greater sense of autonomy and competence. Gaudin and Chaliès’s discussion of indirect evidence about teacher practices and student outcomes (as well as their direct mention of Kiemer and Gröschner’s study in this section) suggests that these findings are in line with Gaudin and Chaliès’s ideals for the effects of video viewing in a successful PD intervention.

Response 4/5

Gaudin and Chaliès (2015) analyzed and categorized 255 studies of the use of video in professional development along the following four dimensions: 1) the nature of the activity teachers engage in when viewing video during professional development, 2) the goals of having teachers view such video, 3) the types of video used, and 4) the effects of viewing video in professional development. Using this four-part conceptualization, one can analyze and summarize any program of professional development, including that known as the “Dialogic Video Cycle” or DVC (Gröschner et al., 2014; Kiemer et al., 2015).

The Dialogic Video Cycle is a professional development program that uses video to support teachers in shifting the nature of the discourse in their classrooms. The DVC consists of two cycles of professional development, each of which is comprised of three workshops. In the first workshop, teachers work in collaboration with one another in modifying a lesson plan that they then implement in their classrooms. Implementation of this modified lesson plan is filmed in each teacher’s classroom. Clips from these video records are then selected by the DVC facilitator and shown to teachers in the second and third workshops of the DVC. In workshop #2, teachers focus on the types of questions posed by teachers to students in the videos viewed, paying particular attention to whether the questions posed are either open (e.g., what do you think happens if we heat it up?) or closed (e.g., do we have any right angles here?). In workshop #3, on the other hand, teachers are asked to focus on the sorts of feedback provided by teachers to students in the videos, as well as share ideas for how to take up students’ correct and incorrect answers. The second cycle of the DVC consists of three similar such workshops that revolve around the teaching of a different lesson plan.

While teachers in the DVC do not select the video clips viewed in workshops #2 and #3, consistent with the core features of effective professional development (Desimone, 2009), they play an active role in this particular program. Throughout both workshops, teachers are asked a series of questions by the professional development facilitator and are generally encouraged to reflect on their experience delivering the lesson that was filmed. Additionally, teachers are encouraged to ask clarification questions of the teacher in the video being viewed, which that teacher can then respond to by providing necessary explanations or describing contextual factors in greater depth.

The objective of engaging teachers in the DVC professional development program was to support them in moving towards a more dialogic model of discourse, in which both teachers and students co-construct meaning together. This objective was pursued because dialogic classrooms are believed to do better at enhancing students’ interest in relevant subject matter than classrooms in which the teacher adopts a more didactic, uni-directional pattern of discourse (Kiemer et al., 2015). As such, the primary objective of this PD program, encouraging teachers to change their discursive practices, was pursued because meeting it was expected to lead to the second objective: enhancing students’ interest in their learning.

The video viewed by teachers in the DVC consists of records of the teachers themselves teaching a lesson they had modified previously in collaboration with one another. Specifically, teachers view video clips selected “on the basis of the criteria of productive classroom discourse” (Kiemer et al., 2015, p. 96). Stated differently, selected video clips are chosen because they will presumably engender rich conversation among teachers about both the questioning behaviour of teachers in the video viewed (workshop #2) and the nature of the feedback provided by teachers in response to student contributions (workshop #3).

According to Kiemer et al. (2015), as a result of the DVC professional development, teachers’ practice did, as hypothesized, change in notable ways. While teachers did not come to ask more open-ended questions as a result of having taken part in the DVC, they did demonstrate significant improvement with regards to the type of feedback provided to students. At the conclusion of the DVC, teachers who took part in this particular development program provided less feedback that simply told students if an answer they had given to some question was right or wrong (i.e., simple feedback) and increasingly gave feedback that highlighted what was right or wrong about an answer, as well as how such an answer could be improved (i.e., constructive feedback). Additionally, and as expected, students in the classrooms of teachers who participated in the DVC PD demonstrated an increased interest in the subjects that their teachers came to teach in a more dialogic manner (Kiemer et al., 2015).

Response 5/5

The implementation of what van Es and Sherin (2009) call “video clubs” has elements that can be critiqued by Gaudin and Chaliès’s (2015) four main conceptualizations of the use of video viewing in professional development. The video clubs are an example of using all four principles in varying degrees, but van Es and Sherin note that the focus was on analyzing student thinking rather than implementation of new methods or changing of teachers’ beliefs (p. 159). This focus for the video clubs has both affordances and limitations when analyzing it against the four principles framed by Gaudin and Chaliès.

First, the nature of the teacher activity while viewing the videos did have a specific focus on describing what they identified as student thinking, and it gave the teachers a structure for interpreting using evidence. Attention to this principle was analyzed and showed some of the highest learning opportunities for teachers. In fact, Gaudin and Chaliès highlighted video clubs as a model for the elements that contributed to increasing a teacher’s capacity to reason (p. 46).

Second, video clubs had specific professional development objectives around student thinking. However, the paper did not report whether the researchers had the objective of “best practices” in mind. The clips were selected for their “potential to foster productive discussions of student mathematical thinking” (van Es & Sherin, 2009, p. 160), but it was not clear whether there was attention to the practices that may have contributed to higher levels of discourse around a problem. As van Es and Sherin report an increase in teachers’ attention to student thinking in their analysis, it is not clear whether their learning goal could have been better served by having teachers critically discuss the practices associated with student thinking; that is, whether a hybrid objective could have better served their main focus.

Third, Gaudin and Chaliès note the limitations of having teachers analyze peers’ professional practice (p. 51). In video clubs, the analysis does show a shift of focus toward the student in discussions, but it is difficult to infer whether watching a peer could have dampened the depth of these discussions. It is also not clear how teachers might have been scaffolded into watching each other, or whether there were practices that could have been critiqued to further develop teachers’ perceptions of student thinking. Gaudin and Chaliès feature studies concluding that the first videos should show an unknown teacher (p. 52). The video clubs, by contrast, go directly into viewing peer videos, which could have affected the shift in the conversations and the depth of teacher analysis reported in the study.

Fourth, van Es and Sherin do show an association between video clubs and practices such as the following: attention to student thinking while teaching, knowledge of curriculum, changes in teachers’ instructional practices, and opportunities for student thinking. Although not all the aspects of the effects of video viewing that Gaudin and Chaliès discuss are explicitly addressed in video clubs, there does seem to be an increase in the teachers’ motivation and cognitive abilities. For example, some teachers report learning more about the curriculum and others “positioning themselves as learners in the classroom” (p. 171).

In all, although I may be examining aspects of video clubs that the four conceptualizations were not intended to address, overall the video clubs have a design and many outcomes that are sound when analyzed through Gaudin and Chaliès’s framework. I found video clubs to be a good base for future professional development programs using video viewing. Considering the four principles, fine-tuned video clubs could hold great promise for teacher learning.

Beyond Bits and Atoms – Week 5 – Makerspace Final Paper Assignment


Final revision of previous assignment.

Link to Google Docs with proper formatting

Makerspace for Classroom Teachers
Lucas Longo – 2016


Makerspaces in schools provide a unique opportunity for teacher professional development in all disciplines. They can offer a situated learning experience in which teachers recall the difficulties students face in the process of learning and thus reflect upon their own teaching practices. Given that the majority of teachers are not familiar with fabrication and electronics, they are put back in the beginner’s seat, opening the possibility of reflecting metacognitively about learning and teaching. Accompanied by engaging discussions and activities grounded in the affordances the space provides, teachers learn about best teaching practices through modeling and engagement in practice.

The activities in makerspaces range across exploring, designing, building, and asking questions – all traits considered desirable in today’s education research. What if we could apply these features to an English Poetry class? How can we promote transfer from teachers’ experiences in the makerspace into their everyday classroom activities? I propose a PD curriculum in which, by engaging in a makerspace, teachers are given the opportunity to reflect on how learning happens, to transfer these ideas into their own practice, and ultimately to affect the learning outcomes of students in their own disciplines.

“In fact, the richness of makerspaces comes not from the fact that the abstract is left out, but that it is brought in together with new ways to build relationships with and between objects and concepts.” (Blikstein & Worsley, 2014, p. 5)

The Space

To explore this idea of using makerspaces as a learning environment for teachers, I visited a public elementary school in Palo Alto, where I interviewed the lab coordinator, whom I shall call Jane. She walked us through the stations, described the activities students engage with in the space, and explained how she actively helps classroom teachers use the space as a learning environment in their disciplines. She used to be a science teacher and ran after-school computer programming workshops before she created the makerspace at the school. Her digital literacy and technical knowledge were paramount in her appointment to the task and, as I see it, essential for designing the space, choosing the tools, and using them for didactic purposes.

The space itself was a regular classroom converted into an open space, with the work stations along the walls and low, paper-covered tables where the students plan and work on their iPads. An outside area is also used for larger projects and is where all the Lego bricks are stored. Jane transformed the room into a makerspace by getting rid of all the closed cabinets along the walls, cutting the tables’ legs to make them more accessible to the students, and installing shelves to store materials and students’ work. This was the first makerspace in the Palo Alto district, created about a year and a half ago, and it is now being used as a model for other schools. An interesting concept that arose from the conversation was that some schools that do not have a full classroom for a makerspace circulate carts of equipment and materials among the teachers, allowing them to use the tools in their own classrooms.

Jane also mentioned that she had worked at the richest school in Palo Alto, where they still do not have a makerspace. ‘They are still thinking about the color of the furniture that will go there’ and ‘the teachers are not aware that the administration is even thinking about or planning to create a makerspace.’ ‘Teachers are not bought into it yet – they have the money but nothing happens. Let the kids do it – figure out what is needed – put it in action – do it.’ Her maker mentality needs somehow to be transmitted to the other schools. Freire would appreciate her statements in the sense that she provides an open space for dialogue and relinquishes control over the experiences the students have in the space. She embraces the notion that students are responsible for their own projects and that teachers learn as much from them as the students might from the teacher.

This transformative approach to teaching is promoted by her not only at her school but also in more formal PD sessions she holds in the space. At the district level she teaches “iPad in the Classroom”, “How to Use Google Docs”, and “Schoology”. Every year she creates new courses to match the teachers’ current software needs. Her drive and content knowledge, applied to ‘spreading the word’, seem to be the key factors in the space’s success and, ultimately, its sustainability. Any learning environment needs someone who will skillfully become its caretaker, curator, facilitator, and enthusiast.

The following pages contain some photos of the space and a floor plan to situate the tools. I also recorded the interview and transcribed it loosely, categorizing what was said by topic. Please refer to the Raw Data section for the full transcript of the interview.


General view of the maker space

3D printer, object scanner, Lego Mindstorms

Smartboard, document camera, 3D printer

Robots, Lego Mindstorm, and mini-drones

Makerspace floorplan

The arrangement of the space is divided into tool stations along the perimeter and work stations in the central area. Even though the space might look ‘messy’, it conveys a message of open exploration where all the tools and supplies are readily available for use. There is no check-out sheet of any kind, nor locked cabinets to which students need to request access. The space itself is open to students during lunch and after school, promoting free access and empowering the students to decide when to work on their projects.

Curricula Integration

Jane uses the space as a learning environment for the classroom teachers at the school, actively engaging with them to create learning experiences in the makerspace. At grade-level meetings she occasionally pops in and gives them ideas for activities and projects they could do in the makerspace that would enhance the learners’ experience with the subject matter. Understanding both the affordances the space provides and the teacher’s current topic of study makes for a rich collaboration and an even richer experience for the students. Students, for example, are encouraged to document their entire process, from design to final product. With these pictures, notes, and videos the students can ‘reflect back and see how they could do it differently next time.’

We could transfer this concept into PD by stressing the importance and value of formative assessment techniques that teachers could use in their own practice. Another example of this kind of transfer comes from the fact that at lunch time all age groups are together in the space, where peer-to-peer teaching is evident. ‘They observe each other and learn from each other – even the older kids learn from the younger kids.’ A teacher PD designed in a makerspace would facilitate and demonstrate the value of collaboration and group work.

Before the space was built, the teachers were ‘apprehensive and did not know what to expect’, but they now see the students’ ‘excitement and learning’.

‘They all get on-board once they see this happening. Classroom teachers think that they have to learn it all themselves. The mindset has changed. Now they know that the kids know more about the apps and the space than they do so they are willing to relinquish control and let them figure it out.’

I found it particularly interesting to note the teachers’ fear of not knowing how to use the makerspace’s tools. Implicitly, this hints at a ‘banking model’ (Freire, 1970) of education, in which the teacher believes they are the holder of all knowledge that must be deposited into the students’ minds. In the makerspace, teachers have to dive into the tools and learn with the students in order to enable them to progress with their projects. Suddenly the students know more than the teacher about a particular tool and, more importantly, about their own project, intentions, and goals. The teacher becomes a facilitator instead of a fact presenter.

I am not suggesting that all of an English Poetry class should be taught in a makerspace, or that we have to throw away the current method of teaching that class. The central idea is to use the makerspace as yet another affordance for engaging students with the content. The challenge is to demonstrate to classroom teachers that it is possible to learn not only from books, lectures, and in-class, in-control situations. It is also ‘more work’ to think about lesson plans, design activities, and ultimately integrate the traditional, familiar curricula with lesser-known methods, tools, and environments.


Creating a makerspace in a school must be accompanied by a proficient leader who can talk the talk and walk the walk. Not only must this person be fluent in the lingo of the technological tools, they must also see that sewing machines, cardboard, wood blocks, glue, and other low-tech materials are excellent resources in a makerspace. In that sense, the makerspace can very well be thought of as an “arts and crafts” space with some new tools for creative expression and active learning. The driver must be to facilitate the guided creation of projects that students engage in, related to the content and learning goals of a classroom teacher’s discipline.

Along with understanding the core drivers that sustain the children’s interest in the space, this person must also engage classroom teachers in the experience, actively promoting experimentation. This role involves exposing teachers to the possibilities the space offers in terms of projects, activities, and processes. Initially, teachers might need help coming up with ideas for integrating their discipline with these possibilities and designing learning experiences that achieve existing learning objectives.

Jane’s pragmatic personality, hands-on approach, and technological know-how are essential characteristics for creating a sustainable makerspace. She continuously makes efforts to spread the word and to help teachers figure out how best to use the space with the integrated, constructivist activities encouraged in makerspaces. It is a process that takes time, determination, and enthusiasm.

Even though a direct link between the makerspace experience and improved academic performance may be hard to measure precisely, the effort seems to be rewarding and to promote growth beyond test results. It teaches both the children and the teachers to co-create, explore, investigate, and ‘make’ their own learning happen.

Raw Data:

Interview Transcript – Smita Kolhatkar – Barron Park Elementary School – Jan 2016

Physical environment

  • Transformed the classroom into a Makerspace by clearing cabinets, cutting the tables to be closer to the ground
  • Space is open during lunch – a free for all
  • Computers and iPads remain on the tables while the gluing station and making areas along the walls
  • This was the first makerspace (1.5 yrs old) and is now being modeled – schools that do not have the space use carts with equipment on them
  • Smallest and poorest school in Palo Alto – 30% “Free Reduced”(???) Lunch, 30% EL, a lot of special needs students
  • She worked at the richest school in Palo Alto but they still don’t have a Makerspace because they are still thinking about the furniture
  • The teachers are not aware that they are planning to start a Makerspace
  • Teachers are not bought into it yet – they have the money but nothing happens
  • Let the kids do it – figure it out what is needed – put in action – do it
  • 50 kids every day at lunch
  • Lunch 12:25 to 1:00

The learning task

  • Classroom teachers and her talk about what they could do for their classroom
  • Students document their entire process
  • Lego kits comes with curriculum
  • Pictures offer a closure of the project
  • Students prefer the tangible affordances of the robots – having to connect them to a computer in order to program them is an obstacle
  • At the District Level she teaches iPad in the Classroom, how to use Google Docs, Schoology
  • Use the Makerspace as a PD environment
    • Low enrollment this year – teachers have a lot to do – held 1 or 2 only in the past year
  • Our teachers were apprehensive before the space opened up – they did not know what to expect – once it opened, they see the students’ excitement and learning and they all get on board
  • At Grade level meetings, she pops in and gives them ideas of what they could do with their classroom
  • As students progress through the years, they come in with previous knowledge and are able to dive into making
  • All age groups get together during lunch time – lots of peer-to-peer teaching. They observe each other and learn from each other – even the older kids learn from the younger kids
  • They teach each other and it comes naturally to them
  • Minecraft – one boy simply observes a group of older students working on their project, learns from it, and makes suggestions about what they should do
  • Classroom teachers think that they have to learn it all themselves – the mindset has changed – now they know that the kids know more about the apps and the space than they do so they are willing to relinquish control and let them figure it out
  • Kids usually finish the projects – almost like an unsaid rule – when they get stuck you help them – but naturally invested in finishing the projects
  • There is no “I Can’t”
  • Exposure to all kinds of things is important
  • Everyone talks about letting kids following their passions but they do not know what passion is!
  • They have an iPad Squad from 5th grade that does updates and maintenance
  • Teach them to document the process by taking photos so that when we have them reflect back they can see how they could do it differently next time
  • Digital etiquette that comes with it is great as well
    • What to do when you search Google and something inappropriate comes up?


  • They already had a 3D printer, LEGO Mindstorm, computers, and iPads
  • Dash&Dot gave them robots
  • iPad apps are really easy to use and very powerful
  • Parents donated quite a bit of material
  • Schools get a tech budget and spend them how they want – teachers were involved in the decision making process
  • Chromebooks – not as intuitive, apps are still coming out, no camera, cumbersome to carry – “no brainer to choose the iPad over Chromebooks” – not as intrusive in the classroom: no screen standing up on the desk
  • Computers are only used for programming but more and more, the programming capabilities are more and more available on the iPads
  • Circuit kits are extremely popular
  • Dissection projects – tear down electronics
  • Laser cutters are too expensive and not very safe – require exhaust and all – ordered a Glow Forge
  • Minecraft – creation mode only
    • Since Microsoft bought it, they are promising lesson plans to be used in the classroom
  • Lego Mindstorms not so good because you need to program on the computer and newer versions come first for the PC – schools have Macs.
  • Make the most out of the resources we have
  • Kids need to be used to different devices – using the computer is good occasionally
  • Apps are easier but are still not quite there for 3D tools
  • Store pictures on Schoology

The students

  • Students are highly engaged – they never want to stop working
  • Kids come in and say “I want to make something today” – they look at the material and start making
  • Gender preferences start appearing in 4th and 5th graders where boys gravitate to Minecraft
  • 1st and 2nd graders – hard but possible to teach coding
  • Girls don’t enjoy sitting in front of the screen
  • All like the robots and the tangibles
  • Coding must be introduced early on for girls – otherwise they would lose interest in it later on
  • Two girls started coding club at their middle school
  • Boys use the sewing machines – don’t even have to ask that
  • We love it because we can make anything we want
  • A lot of boys would even come after school to finish projects
  • They have phases – are into one tool at a time

The teacher

  • She was a classroom teacher and then given the task to integrate technology
  • She acts as the technology integrator
  • First EdCampSVMake focused on Making April 30th – Saturday – 9 to 3 at Barron Elementary – Aimed at educators –
  • Might help that she is a woman in reducing gender biases
  • Her blog –

Maker Studio Initial Equipment List by Smita Kolhakatar

  • Sewing station
    • Sewing machine
    • Loads of fabric
    • Sewing accessories
      • Thread
      • Bobbins
      • Needles
      • Buttons
      • Sewing pins
    • Yarn
    • Looms
  • Filming Station
    • Stands for Stop Motion
    • Props for Stop Motion from Plan Toys
  • Gluing Station
    • Glue
    • Glue sticks
    • Glue guns
    • Masking tape
    • Regular tape
  • Robots Station
    • Bee-bots
  • Supplies
    • Markers
    • Crayons
    • Color pencils
    • Pens
    • Markers
    • Scissors
  • General Materials for building
    • Corks
    • Popsicle sticks
    • CDs
    • Straws
    • Wood scraps
    • Filters
    • Empty cartons
    • Cereal boxes
    • Pegs
    • Pipe cleaners
    • Bottle caps
    • Lots of empty boxes
    • Stuffing
  • Keva Planks
  • Stuffed toys
  • Circuitry
    • Battery packs
    • LEDs
    • Wires
    • Battery cells
    • Snap Circuits
    • Dough for squishy circuits
    • Makey Makey
    • Building kits
  • Movable whiteboards
  • iPad mini
  • MacBook Air
  • Makerbot Replicator 2
  • Makerbot Digitizer
  • LEGO NXT class kit
  • LEGO Storystarter class kit
  • Arduino kits
  • Soldering Kits
  • Laptops
  • Make Wonder Dash and Dot robots
  • Furniture
    • Tables (Low)
    • Tables (High)
    • Chairs
    • Cupboard
    • Built in counter space
    • Wall shelving


Ackermann, E. (2001). Piaget’s constructivism, Papert’s constructionism: What’s the difference. Future of learning group publication, 5(3), 438.

Blikstein, P., & Worsley, M. (2014?). Children are not hackers.

Freire, P. (2000). Pedagogy of the oppressed. Bloomsbury Publishing.

Beyond Bits and Atoms -Dream Toy Assignment 1


The assignment is to interview a child and build a toy with educational purposes. I worked with Aaron Broder on this one.

Dream Toy Assignment
Aaron Broder & Lucas Longo – 2016

Interview report

José is a 14-year-old boy who is interested in architecture and in making a new house for his turtle. When we arrived at the lab to interview him, he had already been there for an hour. He had been coming in for the past four Tuesdays for unguided play. In front of him was a sheet of paper with a small rectangle, its sides labeled 30cm and 50cm. The carefully handwritten title read “Foldable Turtle House”.

Our initial interview plan included talking about what he did after school for fun, whether he had any hobbies, and what games he played on his computer or mobile phone. We would then probe him about the tools and kits he had already played with in the lab and possible projects he might be interested in doing. His drawing, though, took over the conversation.

His 5-year-old turtle’s current house is old, and he wants to build a new one. The problem is that the turtle is in Brazil, so he must somehow be able to take the house on a plane. He seemed stuck in his effort to figure out how he could possibly build a foldable structure. We suggested that it did not necessarily have to be foldable but could instead be disassembled, making it much easier to build.

To move things along, we started talking about what features he wanted in this house. He wanted a simple box that would contain the turtle at night. He thought it would be fun to have a ramp she could go up onto a platform where she could see beyond the walls of the box. What else did his turtle need in this house beyond the structure? “Her food and water bowls are there, and I put some soil on the bottom of the house because it’s softer for her to sleep on,” he said.

He got excited about our suggestion of putting a force sensor under the bowls to detect when they were getting empty. “And then we make an app that alerts me when that happens!” The problem was that this only happened two or three times a week. We talked about a door that would open when the turtle wanted to get into the house – which, he said, would also never happen. We were having a hard time figuring out how to transform this simple house into something that would provide him with a learning experience.

We explained to him that the goal of this project was to create a ‘dream toy’ with a learning goal. He replied that he wanted to learn how to make a toy, rather than get the toy itself. He has played with Lego, remote-control cars and planes, Minecraft, and other toys during different phases, but nowadays it is mostly his computer and smartphone that supply him with entertainment. He then mentioned that he and his father had built the turtle’s current house. At this point we realized what would be the most relevant project for him: to show his father how much he had learned and made during his visit to the US, along with giving his beloved turtle a new house, of course.

Ideation and initial prototype

José would be glad if we built this house for him but would love it if he could learn how to do it himself. We immediately thought of teaching him how to do it, yet what would then be the deliverable for this assignment? If the course were “Curriculum Construction”, our product would be the course; here, we had to come up with a tangible object. Our initial brainstorm produced the idea of a kit that he could assemble on his own. At his age, though, that would be too easy, with little, if any, learning involved. We wanted to stimulate his creativity with scaffolding, not with directions on how to build this house.

How about giving him modules that fit into each other allowing him to build any kind of structure he wants? Like Legos? Ok… next idea…

We then realized that instead of giving him Legos, we could give him a tool with which he could build his own Lego pieces. Instead of giving him the block, give him the mold that makes bricks. This way he could size and shape the house to his liking, while being scaffolded in the trickiest part of building: joints and connectors. We therefore created a template that provides the shapes of these joints and connectors, with which José could trace the contours onto cardboard and cut out the pieces.

We named it the Template Realization Tinkering Lab (TRTL) Construction Kit: basically, a piece of wood with cut-outs for tracing the basic shapes of the connectors, which can then be cut out. TRTL therefore allows for infinite variations and true exploration of building, scaffolded by the template. It allowed us to shift from building a toy for José to providing a tool that will enable him to learn about size, scale, and structure while activating his own creativity and the exploratory nature of making.

A future development of TRTL will include the GoGo Board as part of the template to enhance the creation possibilities of the tool even further. This would require a handbook or instruction manual on how to use and set up the GoGo Board, with examples of applications for the different sensors and of how to write the code to make it all work.

In general, we learned that creating a toy is not as challenging as creating a learning experience. Toys should be ‘fun’ for the sake of ‘fun’. A learning experience not only could be fun but must have learning objectives that take into consideration the developmental stage of the learner, interests, motivations, previous knowledge, content relevance, cultural context, socio-economic status, and several other factors that will determine its effectiveness. In this sense, we believe that TRTL may suit several contexts due to its simplicity and malleability, insofar as it is a template that allows for open-ended creations. We now need to test the tool and observe whether it actually helps in the learning process of building structures.

Documentation of the prototyping process

Figure 1: Starting to sketch the construction kit

Figure 2: Playing with a destructible turtle house

Figure 3: Talking sensors with Engin

Figure 4: Moving from a blueprint to a template kit

Figure 5: Planning the joins and labels for the template kit

Figure 6: Starting to draw the prototype in CorelDRAW

Figure 7: A cardboard prototype

Figure 8: Iterating on the instructional diagrams for the next prototype

Beyond Bits and Atoms – Week 5 – Reading Notes


DiSessa, A. A. (2001). Changing minds: Computers, learning, and literacy. MIT Press.

  • Computer Literacy
    • A common term for being able, for example, to turn a computer on, insert a CD, or use the mouse
  • Material Intelligence – Literacy
    • Intelligence achieved cooperatively with external materials
  • Infrastructural Knowledge
    • Content that is widely adopted and used as basis for new content (e.g. Calculus)
  • Evolution of material intelligence
    • Galileo’s theorems took him pages to describe
      • “Theorem 5 – If two particles are moved at a uniform rate, but with unequal speeds, through unequal distances, then the ratio of the time intervals occupied will be the product of the ratio of the distances by the inverse ratio of the speeds.”
      • t1/t2 = (d1/d2) · (r2/r1) (t: time, d: distance, r: rate)
      • There was no algebra at the time
        • Only in the 20th century did algebra become widely adopted
      • A lack of mathematical notation – of material intelligence
  • What are the possible future literacies?
  • Romance novels being read in subways – a social niche – factors influencing their adoption
    • Require being able to read
    • Most readers are women
    • Romantic love as an accepted genre
    • No sanctions against it (e.g. Playboy magazine)
    • Price of printing and revenue share with authors
    • Printing press
    • Uncrowded trains
  • Definitions
    • “A literacy is the convergence of a large number of genres and social niches on a common, underlying representational form.” (DiSessa, 2001, p.24)
    • “Genre is to social niche as species is to ecological niche.” (DiSessa, 2001, p.24)
  • Perspectives on social niches
    • Values, interests, motivations
    • Skills and capabilities
    • Materials
    • Community and communal practices
    • Economics
    • History
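The Galileo example above shows how much work modern notation saves: with t = d/r, Theorem 5 collapses into a one-line identity. A quick sketch (my own illustration, with arbitrary values) checks the product form against direct computation:

```python
# Galileo's Theorem 5 in modern notation: for uniform motion t = d / r,
# so t1 / t2 = (d1 / d2) * (r2 / r1) -- the ratio of the distances times
# the inverse ratio of the speeds.

def time_ratio(d1, r1, d2, r2):
    """Ratio of travel times for two particles moving at uniform rates."""
    return (d1 / d2) * (r2 / r1)

# Arbitrary illustrative values: particle 1 covers 120 units at speed 4,
# particle 2 covers 90 units at speed 6.
d1, r1 = 120.0, 4.0
d2, r2 = 90.0, 6.0

t1, t2 = d1 / r1, d2 / r2          # direct computation: 30 and 15
assert abs(time_ratio(d1, r1, d2, r2) - t1 / t2) < 1e-12
print(time_ratio(d1, r1, d2, r2))  # 2.0
```

What took Galileo a paragraph of prose is here one line of algebra: exactly DiSessa's point about material intelligence.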

Wilensky, U. (2010). Restructurations: Reformulating Knowledge Disciplines through New Representational Forms. Learning Sciences, Computer Science and Complex Systems, Northwestern University

  • Structuration and restructuration of a discipline
    • From Roman to Hindu-Arabic numerals
    • “the encoding of the knowledge in a domain as a function of the representational infrastructure used to express the knowledge.” (Wilensky, 2010, p.2)
  • Core properties of structurations
    • Power properties – must do what was done before but better
    • Cognitive properties – must be easier to learn
    • Affective properties – memes – ideas that spread in an evolutionary manner through society, social niche, or culture
    • Diversity properties – must attend to all ‘intelligences’ and people’s styles
  • A circle can be described in several ways
    • All points are at the same distance from a point called the center (Euclid)
    • The formula for a circle is x^2 + y^2 = r^2 (Descartes)
    • Logo turtle – if constant linear and angular speed is maintained, a circle is drawn
    • Logo turtles – place many of them at a central point and have them all go straight for the same distance: their end points form a circle
  • Agent-based modeling
    • Observe a phenomenon and try to create an equation that fits the observed data
    • Agents have individual procedures which affect the larger population
      • Lynx-Hare Example
      • The Tick Model – Newtonian Physics and Beyond
      • GasLab – Statistical Mechanics and beyond
      • MaterialSim – Materials and Beyond
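The turtle descriptions of a circle above can be checked computationally. This toy sketch (my own illustration, not from Wilensky) traces a path with a constant step and a constant turn and verifies that the resulting vertices all lie at one distance from a common center, i.e. that the Logo description agrees with Euclid's:

```python
import math

# Logo-turtle circle: repeat a small constant forward step and a small
# constant turn. The vertices of the resulting closed path form a regular
# polygon, so they are all equidistant from a common center -- which is
# exactly Euclid's description of a circle.

def turtle_circle(step=1.0, turn_deg=10.0):
    """Trace one full revolution; return the list of vertices."""
    x, y, heading = 0.0, 0.0, 0.0
    points = []
    for _ in range(int(360 / turn_deg)):
        x += step * math.cos(math.radians(heading))
        y += step * math.sin(math.radians(heading))
        heading += turn_deg
        points.append((x, y))
    return points

pts = turtle_circle()
cx = sum(p[0] for p in pts) / len(pts)   # center = centroid of the polygon
cy = sum(p[1] for p in pts) / len(pts)
radii = [math.hypot(px - cx, py - cy) for px, py in pts]
assert max(radii) - min(radii) < 1e-9    # all vertices equidistant: a circle
```

Each representation makes a different property salient: Euclid's makes the radius explicit, Descartes's the coordinates, and the turtle's the local rule of motion.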

Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems approach to making sense of the world. Journal of Science Education and technology, 8(1), 3-19.

  • Agent-based models
    • Traffic-jam example of agent-based modeling
      • Cars are moving forward all the time, yet the jam itself flows backwards
    • Waves
      • Particles themselves move perpendicular to the direction of travel of the wave
  • Levels
    • NOT hierarchical levels
    • Individual level has its own behaviors and properties
      • Interactions of many of these individuals create emergent properties – a new level
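The traffic-jam example can be sketched in a few lines of Python (an illustrative model of mine, not from the paper): each car on a ring road advances one cell per tick only if the cell ahead is empty. Every individual car moves strictly forward, yet the cluster of stopped cars (the jam) drifts backward, an emergent property visible only at the aggregate level.

```python
def step(occupied, n):
    """Synchronous update: a car advances iff the next cell is currently empty."""
    new = set()
    for pos in occupied:
        nxt = (pos + 1) % n
        new.add(nxt if nxt not in occupied else pos)
    return new

n = 12
occupied = set(range(7))        # 7 cars bumper-to-bumper on a 12-cell ring
jams = []                       # stopped cars: those with a car directly ahead
for _ in range(10):
    jams.append(sorted(p for p in occupied if (p + 1) % n in occupied))
    occupied = step(occupied, n)

# Individual cars only ever move forward, but the jam drifts backward:
print(jams[6:])                 # [[10, 11], [9, 10], [8, 9], [7, 8]]
```

No car ever backs up, yet the jam's position decreases by one cell per tick once it wraps around the ring: the two levels have genuinely different behaviors.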

Curriculum Construction – Week 5 – Curriculum Rationale Assignment



Directions for Curriculum Rationale

Each group will submit a curriculum rationale that includes the following components:

  • Information about the Site

Give a brief description of the site for which you are constructing your curriculum.  What do you know about the context that will influence what you produce? What do you know (if anything) about the person/people who will be implementing your curriculum?

  • Ideology/Theory

What ideologies or curriculum theories undergird your curriculum?  How do these ideologies/theories influence the design of the curriculum? Be sure to cite particular theorists as appropriate.

  • The Learners

Who are they?  What do you know about them?  What assumptions are you making about how they learn and what is important for them to know?

  • Overall Rationale

Identify the overall why, what, and how of your curriculum and explain why you made these choices.  This section may include a rough outline of topics to be covered and possible scope of the unit. It should be clear how the rationale fits the setting and is appropriate for the learners that you’ve described.

Please bring hard or electronic copies of your rationale to class next week for peer review.  The instructors will also give you feedback about the rationale, so please send us an electronic copy as well.



Curriculum Rationale
Celine Zhang, Lisa Jiang, Lucas Longo, Mohamad Haj Hasan

Information about the Site

The site we are working with is the Operations, Information & Technology (OIT) Department at the Stanford Graduate School of Business (GSB). The site is looking to add some online elements to their Base course in Data and Decisions, an introductory course in probability, statistics, and regression that mixes theory, concepts, and practical application in a business context. The course has been taught the same way for the past 15-20 years, in a completely lecture-based format of mostly theory with limited hands-on application during class time. The culmination of the class is a practical project intended to model a real-world application of the concepts learned in class, and the students have an opportunity to work with real clients who have real data and decision needs.

The site would like to design and put all the theoretical and conceptual components of the course online, and utilize class time for more engaging practical applications, clarification of the online content and general discussion. The theoretical part of the course is almost perfectly suited for online consumption for the following reasons:

  1. Students with different backgrounds in the subject can learn at their own pace, repeating concepts and formulas as many times as they want.
  2. The content to be put online is very much “passive” in the sense that little is lost from a one-sided online lecture.

The site would also like to have some adaptive assessment solutions online that would act both as a feedback mechanism for students as well as an observational tool for the teachers as to how students are learning and progressing in the class.

The person leading this initiative is Allison O’Hair. Allison has experience in designing and implementing online content for MIT Sloan and is very knowledgeable on the medium of online education. It is interesting to note that although the OIT Department is leading this initiative, the course is technically under the Economics Department at the GSB.


Ideology/Theory

Our curriculum would be undergirded primarily by Dewey’s concept of progressivism, in that we hope to design learning experiences that drive students’ innate desire to learn. According to Dewey, the educator’s role is to set up the right conditions for transfer, rather than teach lessons in isolation. The sign of a mature learner is then someone capable of both identifying and solving their own problems. This focus on “transfer” to enable students to apply their learnings beyond the classroom would definitely be a key learning goal in our curriculum, with ample class time devoted to discussing practical applications of concepts and a real-world project for assessment.

In addition, Dewey placed great emphasis on the interaction between internal and objective conditions for curriculum design. He contended that curriculum construction was always contextual, and that “the trouble with traditional education was not that it emphasized the external conditions that enter into the control of the experiences but that it paid so little attention to the internal factors which also decide what kind of experience is had”. Personalizing the learning experiences for students based on their “internal” conditions, such as their backgrounds, prior knowledge, social-emotional skills and so forth, would be key to effective learning. Ideally, these learning experiences should help prepare students for later experiences and drive continued learning. Selecting and creating learning experiences – be it direct instruction, class discussions or assessments – that are tailored to students’ backgrounds will very much be a core consideration in our curriculum design. Our ultimate goal would be to equip our students with both the desire and skill to continue enhancing their understandings of probability, statistics and regression application in a business context.

In a similar vein to Dewey’s progressivism, Bruner’s theory of emphasizing structure, transfer, and students’ readiness for learning would also underpin our curriculum design. We agree with Bruner that the ultimate goal of education is to help students “learn how to learn” and facilitate transfer. The emphasis on structure, then, is critical because a deep understanding of structure is also a deep understanding of how things are related, thereby permitting transfer. The implication for us would be to delineate the key concepts to be covered in the course and sequence them in such a way that “earlier learning renders later learning easier … by providing a general picture in terms of which the relations between things encountered earlier and later are made as clear as possible”. In terms of readiness to learn, we are cognizant that the learners in our course may come equipped with different levels of understanding and aptitudes, and it is our hope to create a curriculum that provides satisfying learning experiences for students regardless of their existing preparedness for the course. We want our curriculum to result in a class that stimulates our students’ desires to learn.

The Learners

The learners are first year MBA students at the GSB (MBA1s). The Base level of Data and Decisions targets students with little or no background in the subject; however, the class may have some students with a decent background who have chosen to take the Base level instead of the Intermediate or the Advanced levels of the subject. The site is targeting a launch date of Winter 2017 to pilot the course, and it would be presented as an opt-in option for eligible Base students.

The learners bring a wide range of background knowledge and experiences. While most learners were exposed to some part of the content in high school or college, many of them may not have been exposed to or have applied the concepts at work. We would also assume that the learners have different computer skills. This is important because the course uses some Excel and quite a bit of R, a programming language and software environment for statistical computing and graphics. R is especially important for the final project, where learners work with a real company and real data to help the company answer some critical business questions using the data. In addition, the learners are in the process of getting a wider business degree, and we assume they are more interested in the business applications of the content as opposed to the details of the formulas and their derivation. For example, it may be more beneficial to know the idea behind variance, how it’s calculated in Excel or R, and its use, as opposed to knowing the exact formula.
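As an illustrative sketch (our example, not from the course materials) of knowing “the idea behind variance” rather than the exact formula: sample variance is just the average squared distance from the mean, with an n - 1 divisor, which is what Excel’s VAR.S and R’s var() return.

```python
def sample_variance(data):
    """Average squared deviation from the mean, with an n - 1 divisor
    (the same 'sample' variance that Excel's VAR.S and R's var() compute)."""
    n = len(data)
    mean = sum(data) / n
    return sum((x - mean) ** 2 for x in data) / (n - 1)

# Hypothetical weekly returns for a small portfolio (made-up numbers)
returns = [0.02, -0.01, 0.03, 0.00, 0.01]
print(sample_variance(returns))   # ~0.00025
```

A learner who grasps “spread around the mean” can reach for VAR.S or var() and interpret the result, which is the business-relevant skill here.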

It is fair to assume that the MBA1s are also busy with many social and academic events, which means that their attention and dedication to the course will be spread thin. MBA students, in particular, care deeply about ‘authentic’ learning experiences and will only devote their time and energy to topics that they perceive as having direct connections to their professional pursuits. We also assume that all MBA students have the appropriate technology affordances for online learning – high-quality internet access and up-to-date computers to stream videos and run statistical programs.

Overall Rationale

The Data and Decisions course was originally designed as a support course, or prerequisite, for other courses such as Finance and Accounting. It was intended to provide a basic overview of how to use data to extract information that supports decision-making. Since then, specific topics have been added to or subtracted from the curriculum to focus more on data analysis than on the calculation of probabilities and statistical procedures. Students will not only learn methods of using data but, more importantly, should be able to build models and critique them. The hope is that students will become intelligent consumers of data who can look at it and interpret it.

This shift towards decision-making based on data analysis is now central to the current redesign of the curriculum. The goal is to shift from the ‘teaching of formulas’ to doing problem sets, discussions, and application of core concepts. Because the teaching of formulas and procedures tends to be linear and repetitive, we believe this material is a great candidate for online presentation, instead of using valuable classroom time.

Online content also adapts to students’ prior knowledge and pace. Problem sets can be personalized for each student’s level of understanding, thus ensuring everyone’s preparedness for the course’s learning progression. Discussion forums and peer-review mechanisms can also provide different learning opportunities for those who have different learning styles and prefer more collaboration or explanations in different forms. These mechanisms can also serve as great formative assessment, helping teachers identify common misconceptions and course-correct. The implementation of these features and the technological platform to be used remain undecided.

The original course content sequence is as follows:

  1. The first area, probability, provides a foundation for modeling uncertainties, such as the uncertainties faced by financial investors or insurers. We will study the mechanics of probability (manipulating some probabilities to get others) and the use of probability to make judgments about uncertain events.
  2. The second area, statistics, provides techniques for interpreting data, such as the data a marketing department might have on consumer purchases. Statistical methods permit managers to use small amounts of information (such as the number of people switching from Verizon to AT&T in an iPhone test marketing program) to answer larger questions (what would AT&T’s new market share be if the iPhone is launched nationally?)
  3. The third area, regression analysis, is the set of techniques that allow companies to build statistical models of different facets of their businesses. Examples include predicting which movies a customer may like based on her past movie ratings (e.g. Netflix), predicting the sales price of a house (e.g. Zillow), or predicting the sales response to a new ad (e.g. Google).
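The third area can be made concrete with a tiny sketch (ours, with made-up numbers, not from the course): ordinary least squares fits a line to observed data, for example predicting a house’s sale price from its size, in the spirit of the Zillow example above.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Made-up data: house size (sq ft) vs. sale price ($ thousands)
size  = [1000, 1500, 2000, 2500]
price = [200, 290, 410, 490]
intercept, slope = fit_line(size, price)
print(intercept + slope * 1800)   # predicted price for an 1800 sq ft house
```

In the course itself students would use R (or Excel) for this, but the underlying model is the same fitted line.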

Original course grading

  • Class Participation Evaluation 10%
  • Mid-term Exam 20%
  • Homeworks 15%
  • Regression Project 20%
  • Final Exam 35%

The proposed course content sequence flips the original so that students have an end goal in mind and learn on a ‘need-to-know’ basis.

  1. Final project – Phase 1
    1. Show previous final projects as examples
    2. Explain what quality work looks like
    3. Show final project grading rubric
    4. Select a real company to obtain data from
  2. Regression analysis – Phase 1
    1. What is it
    2. Examples of how to use it
    3. Underlying concepts  
      1. Regression
      2. Statistics
      3. Probability
  3. Final project – Phase 2
    1. Data manipulation and clean up
    2. Desired data representations or key performance indicators
  4. Regression analysis – Phase 2
    1. How to do it with your own data
    2. Underlying concepts  
      1. Regression
      2. Statistics
      3. Probability
  5. Final project – Phase 3
    1. Data analysis
    2. Present project and results
    3. Peer-review sessions
  6. Conclusion
    1. Cases and further discussions
    2. Feedback from professor and company

Curriculum Construction – Week 5 – Reading Notes


McTighe, J., & Ferrara, S. (1998). Assessing Learning in the Classroom. Student Assessment Series. NEA Professional Library, Distribution Center, PO Box 2035, Annapolis Junction, MD 20701-2035.

  • Assess teaching and learning, not the student and grades
    • “The primary purpose of classroom assessment is to inform teaching and improve learning, not to sort and select students or to justify a grade.” (McTighe & Ferrara, 1998, p.1)
  • Latin roots
    • “the term assessment is derived from the Latin root assidere meaning ‘to sit beside.’” (McTighe & Ferrara, 1998, p.2)
    • “Assidere suggests that, in addition to tests and projects, classroom assessments include informal methods of ‘sitting beside,’ observing, and conversing with students as a means of understanding and describing what they know and can do.” (McTighe & Ferrara, 1998, p.2)
  • Types of assessment
    • Tests
      • Rigid format: time limits, paper and pencil, silent
      • Limited set of responses: limited access to source material
    • Evaluation
      • Make judgements regarding quality, value, or worth
      • Pre-set criteria
    • Summative assessment
      • culminating assessment that provides a summary report
    • Formative assessment
      • Ongoing diagnostic
      • Helps teachers adjust instruction
      • Improve student performance
      • Determine previous knowledge
      • Determine ongoing understandings and misconceptions
  • Large scale assessment
    • Usually standardized tests
      • High-stakes
    • Educational accountability
    • Norm referenced
      • Easier interpretation
      • Comparison with others
      • Averages to determine your position
    • Criterion referenced
      • Compared to pre-established standards
  • Classroom assessments
    • Diagnose student
    • Inform parents
    • Improve practice
  • Effective Classroom Assessment
    • Inform teaching and improve learning
      • Performance-based assessments
        • Focus instruction and evaluation
        • Students understand criteria for quality
        • Students get feedback and revise their work
        • Peer- and self-evaluation
    • Multiple sources of information
      • Single test is like a single photograph
      • Frequent sampling
      • Use array of methods
        • Create a Photo Album instead of single photo at the end
          • Different times
          • Different lenses
          • Different compositions
    • Valid, reliable, and fair measurements
      • Validity: How well it measures what it is intended to measure
      • Reliability: If repeated, would you get the same results?
      • Fairness: give students equal chances to show what they know and can do without biases or preconceptions
    • Ongoing
  • Content Standards
    • Declarative knowledge
      • what do students understand (facts, concepts, principles, generalizations)
    • Procedural knowledge
      • what do we want students to be able to do (skills, processes, strategies)
    • Attitudes, values, or habits of mind
      • how we would like students to be disposed to act (appreciate the arts, treat people with respect, avoid impulse behavior)
  • Purpose & Audience
    • Why are we assessing?
    • How will the assessment results be used?
    • Who are the results intended for?


  • Assessment Approaches and Methods
    • Approach – what do you want students to do?
      • Select a response
      • Construct a response
      • Create a product
      • Provide an observable performance
      • Describe their thinking/learning process
    • Selected-Response Format
      • Positive
        • Wide range of knowledge can be ‘tested’
        • Easy to implement
        • Easy to evaluate and compare
        • Fast
      • Negative
        • Assess knowledge and skills in isolation and out of context
        • Not able to assess critical thinking, creativity, oral communication, and social skills
        • Real-world does not have single correct answers
        • Focuses students on acquisition of facts rather than understanding and thoughtful application of knowledge
    • Constructed-Response Format
      • Brief Constructed Response
        • Short written answers
        • Visual representations
        • Positive
          • Students have a better opportunity to show what they know
          • Easier to construct and evaluate than other constructed responses
        • Negative
          • Does not assess attitudes, values, or habits of mind
          • Requires judgment-based evaluation – lower reliability and fairness
      • Performance-Based Assessment
        • Requires students to apply knowledge and skills rather than recalling and recognizing
        • Associated terminology:
          • Authentic assessment
          • Rubrics
          • Anchors
          • Standards
            • Content standards – what students should know
            • Performance standards – how well students should perform
            • Opportunity-to-learn standards – is the context right
        • Positive
          • Content-specific knowledge
          • Integration of knowledge across subject-areas
          • Life-long learning competencies
        • Negative
          • Do not yield a single correct answer or solution – allows for wide range of responses (also positive)
        • Types
          • Product
            • “Authentic” since it resembles work done outside of school
            • Portfolio to document, express individuality, reflect, observe progress, peer- and self-evaluation
            • Criteria must be identified and communicated with students
          • Performance
            • Can observe directly application of knowledge
            • Students are more motivated and put greater effort when presenting to ‘real’ audiences
            • Time- and labor-intensive
          • Process-focused assessment
            • Information on learning strategies and thinking processes
            • Gain insights into the underlying cognitive processes
            • Examples
              • “How are these two things alike and different?”
              • “Think out loud”
            • Continuous and formative


  • Evaluation Methods and Roles
    • Scoring Rubric (Rubrica – red earth used to mark something of significance)
      • Evaluative criteria
      • Fixed scales
      • Description of how to discriminate levels of understanding, quality, or proficiency
      • Holistic Rubric
        • Overall impression of quality and levels of performance
        • Used for summative purposes
      • Analytic Rubric
        • Level of performance along two or more separate traits
        • Used in day-to-day evaluations in classroom
      • Generic Rubric
        • General criteria for evaluating student’s performance
        • Applied to a variety of disciplines
      • Task-specific Rubric
        • Designed to be used in a specific assessment task
    • Anchors
      • Examples that accompany a scoring rubric
    • Rating scales
      • Bipolar rating scales – bad & good, relevant & irrelevant
    • Checklists
      • Good to ensure no element is forgotten or attended to
    • Written and oral comments
      • Best level of feedback – communicates directly with student
      • Must not be only negative feedback
  • Communication and Feedback Methods
    • How to communicate results?
    • Numerical scores & Letter grades
      • Widely used but not descriptive
    • Developmental and Proficiency Scales
      • Contain description of quality and performance


    • Checklists
      • Careful with poorly defined categories like creativity – open to interpretations
    • Written comments, narrative reports, verbal reports, and conferences
      • Communicate directly with each student
      • Time-consuming
  • Assessment not only measures outcomes but also invokes the values, the how, and the what of learning.
  • Great glossary at the end of this paper.

Coffey, J. (2003). Involving Students in Assessment. In J. Atkin & J. Coffey (Eds.) Everyday Assessment in the Science Classroom. Arlington, VA: National Science Teachers Association. pp. 75-87.

  • Assessment is an opportunity for learning
    • “Whether it comes after teaching, while teaching, or by teaching, we often think of assessment as something done to students, not with them.” (Coffey, 2003, p.76)
  • Teachers
    • check assignments and interpret student responses
    • listen closely to students’ questions so that they can gain insight into their students’ understandings
    • seek to make explicit the assessment criteria so that all students know how they will be evaluated
    • try to use what they learn through assessment to inform teaching, plan future learning activities, and provide relevant feedback
    • constantly gauge trends in class engagement, interests, and understanding
    • strive to fairly assign grades that accurately reflect what a student knows and is able to do.
  • Everyday Assessment
    • “Everyday assessment is a dynamic classroom activity that includes the ongoing interactions among teachers and students as well as more scheduled events, such as weekly quizzes and unit tests.” (Coffey, 2003, p.76)
    • “One of the many purposes of everyday assessment is to facilitate student learning, not just measure what students have learned.” (Coffey, 2003, p.77)
  • Key Features of Assessment
    • explicating clear criteria (Butler and Neuman 1995)
    • improving regular questioning (Fairbrother, Dillon, & Gill 1995)
    • providing quality feedback (Kluger and DeNisi 1996; Bangert-Drowns et al. 1991)
    • encouraging student self-assessment (Sadler 1989; Wolf et al. 1991)
  • Responsibility for own learning
    • “When students play a key role in the assessment process they acquire the tools they need to take responsibility for their own learning.” (Coffey, 2003, p.77)
  • Low performing benefited the most
    • “Lower-performing students … showed the greatest improvement in performance when compared to the control class.” (Coffey, 2003, p.77)
  • Learning From Connections
    • “Through the students’ explicit participation in all aspects of assessment activity, they arrived at shared meaning of quality work. Teachers and students used assessment to construct the bigger picture of an area of study, concept, or subject matter area. Student participation in assessment also enabled students to take greater responsibility and direction for their own learning.” (Coffey, 2003, p.78)
  • Shared Meanings of Quality Work
    • Activities
      • students generating their own evaluation sheets
      • conversations in which students and teachers shared ideas about what constituted a salient scientific response, or a good presentation, lab investigation, or project
      • discussion of an actual piece of student work
      • students’ reflections on their own work or a community exemplar
      • students’ decision making as they completed a project
  • Assessment as a Means to Connect to a Bigger Picture
    • “Teachers and students leveraged test review as an opportunity to return to the bigger picture of what they had been studying. The class talked about what was going to be covered on the test or quiz so that all students knew what to expect.” (Coffey, 2003, p.84)
  • Assessment as a Vehicle to facilitate Lifelong Learning
    • “The test process also encompassed graded responses after the test, and students would often do test corrections after going over the test. On occasion students would write test questions and grade their own work.” (Coffey, 2003, p.84)
  • Creating Meaningful Opportunities for Assessment
    • Time
    • Use of Traditional Assessment
    • Public Displays of Work
    • Reflection
    • Revision
    • Goal Setting
  • Results
    • “Despite initial resistance, as students learned assessment-related skills, demarcations between roles and responsibilities with respect to assessment blurred. They learned to take on responsibilities and many even appropriated ongoing assessment into their regular habits and repertoires.” (Coffey, 2003, p.86)

Treagust, D., Jacobowitz, R., Gallagher, J, & Parker, J. (March 2003). Embed Assessment in Your Teaching, Science Scope. pp. 36-39.

  • Effective strategies for implementing embedded assessment
    • Use pretests
      • identify students’ personal conceptions
      • misconceptions
      • problems in understanding the topic
    • Ask questions to elicit students’ ideas and reasoning
      • “Acknowledge each student’s answers by recording them on the board or by asking other students to comment on their answers.” (Treagust, Jacobowitz, Gallagher, & Parker, 2003, p. 37)
    • Conduct experiments and activities
      • challenge their own ideas
      • write down their findings
      • share with their peers.
    • Use individual writing tasks
      • capture students’ understanding
      • teacher can assess their progress
    • Use group writing tasks
      • students work together to illustrate each other’s respective understanding
    • Have students draw diagrams or create models
  • Results
    • “25 percent of students in the class taught by one of the authors were rated “Proficient” on the MEAP Science Test compared to 8 percent of other eighth grade classes in the school” (Treagust, Jacobowitz, Gallagher, & Parker, 2003, p. 39)
    • “Moreover, students become more engaged in learning when their teacher gives attention to students’ ideas and learning, and adjusts teaching to nurture their development.” (Treagust, Jacobowitz, Gallagher, & Parker, 2003, p. 39)

Echevarria, J., Vogt, M, & Short, D., (2004). Making Content Comprehensible for English Learners: The SIOP Model. (2nd edition). Boston: Allyn & Bacon. pp. 21-33

  • Sheltered Instruction Observation Protocol (SIOP)
    • Content Objectives
    • Language Objectives
    • Content Concepts
    • Supplementary Materials
    • Adaptation of Content
    • Meaningful Activities

Teacher PD – Week 5 – Class Notes


PD Design of DVC (Dialogic Video Cycle)

  • 2 groups – intervention group & control group
  • Cycle of 3 events and then repeated
  • Did not see so much about the “Prediction”

Video Clubs (Van Es)


  • noticing student thinking
  • confusion evidence to evidence it and engage in the videos
  • discourses of noticing – noticing framework
  • 1 year – 7 interventions – self selected
  • focused on student thinking rather than content
  • looked at comments while watching the videos
  • self-reports focused on how much they elicit student thinking
  • more developmental vs normative video use
  • looked at peer videos

Research Methods

  • research question was very open
  • how video clubs would influence:
    • teacher’s thinking about student’s learning
    • looking at teacher as learner
  • videotaped the PD itself – transcripts
    • fine-grained analysis
      • actor – object of focus
      • topic – what they talked about
      • stance – what type of discourse they engaged in
      • video based or non video based evidence
    • Results
      • teacher’s views shifted
        • actor – more about student than the teacher
        • topic – from classroom management to mathematical thinking
        • stance – from evaluation to evidence based discussion
      • classroom instruction was also videotaped
        • coded for student and whole group discussions
        • changes in instruction
          • made space for student thinking
          • publicly demonstrated student learning
          • probed students for more evidence
          • learning about teaching
      • teacher exit interviews
        • changes in thinking and practice
          • look at student thinking
          • attending to student thinking
          • own school curriculum

Reading for Week 6 – Roth & NARST

Teacher PD – Week 5 – Reading Notes


Borko, H., Jacobs, J., Seago, N. & Mangram, C. (2014). Facilitating video-based professional development: Planning and orchestrating productive discussions. In Y. Li, E.A.Silver & S. Li (Eds.) Transforming mathematics instruction: Multiple approaches and practices (pp. 259-281). Dordrecht: Springer.

  • Use of video to discuss the teaching practice
  • Video watching must be skillfully guided
    • “To successfully lead such discussions requires that teachers have deep knowledge of the relevant content, of student thinking about that content, and of instructional moves that are likely to guide the discussion in fruitful directions.” (Borko, Jacobs, Seago, & Mangram, 2014, p.261)
  • Best practices
    • Anticipating student responses
    • Monitoring their thinking
    • Selecting approaches for the class to explore
    • Sequencing students’ shared work
    • Connecting student responses to one another and to key ideas
  • Three decision points when planning a video-based discussion
    • Determine goals of discussion and select video clips
    • Identify goal relevant features of the video clip
    • Create questions to guide the discussion
  • Three practices for orchestrating productive discussions
    • Think about lesson segment
    • Probe for evidence of their claims
    • Connect analysis to key mathematical and pedagogical ideas
  • Content accompanying video for PD facilitators
    • Time-coded transcript
    • Lesson graph
    • Guiding questions to ask
    • Notes on the clip
      • “Back pocket” questions
      • Mathematical support
      • Cautionary notes
  • The need for a PD for PD facilitators

Gaudin, C., & Chaliès, S. (2015). Video viewing in teacher education and professional development: A literature review. Educational Research Review, 16, 41-67.

  • The need for facilitation in video-based PD
    • “How can teaching teachers to identify and interpret relevant classroom events on video clips improve their capacity to perform the same activities in the classroom?” (Gaudin & Chaliès, 2015, p.41)
  • Teachers must be trained to identify relevant events
    • “Most authors agree that enriching selective attention should be an objective of both teacher education and professional development. Indeed, both PTs and ITs suffer from an inability to identify relevant classroom events without training and focus.” (Gaudin & Chaliès, 2015, p.46)
  • Teachers must be able to
    • Describe
    • Explain
    • Predict
  • “Disposition to notice” and “capacities to reason”
  • Objectives of video viewing in teacher education and professional development
    • Show example of good teaching practices
    • Show characteristic professional situations
    • Analyze the diversity of classroom practices from different perspectives
    • Stimulate personal reflections
    • Guide/coach teaching
    • Evaluate competencies
  • Two main categories of video use
    • Developmentalist – how to interpret and reflect on classroom practices
    • Normative – what to do in the classroom
  • Select videos of “‘examples’ not ‘exemplars’”
  • Videos of
    • unknown teacher activity
    • peer activity
    • own practice
  • Effect of video viewing in TE & PD
    • Motivation
    • Cognition
    • Classroom practice

Gröschner, A., Seidel, T., Kiemer, K., & Pehmer, A.-K. (2014). Through the lens of teacher professional development components: The ‘Dialogic Video Cycle’ as an innovative program to foster classroom video. Professional Development in Education, DOI: 10.1080/19415257.2014.939692

  • How to teach “Productive classroom dialogue”
    • “Productive classroom dialogue refers to approaches to classroom communication in which teacher and students, through purposeful classroom talk, engage in a continual process of the co-construction of knowledge (Wells and Arauz 2006, Mercer and Littleton 2007, Alexander 2008).” (Gröschner, Seidel, Kiemer, & Pehmer, 2014, p.2)
  • Effective components of professional development
    • Content focus
    • Active learning
    • Collective participation
    • Duration
    • Coherence
  • Self-determination Theory (SDT)
    • “teachers’ abilities to foster perceptions of autonomy, competence and (social) relatedness.” (Gröschner, Seidel, Kiemer, & Pehmer, 2014, p.8)
    • “In the field of PD and workplace learning, studies found that autonomous motivation also supports job satisfaction and predicts the quality of transfer of PD experiences in daily work (Gegenfurtner et al. 2009).” (Gröschner, Seidel, Kiemer, & Pehmer, 2014, p.8)
  • Problem-Solving Cycle (PSC)
    • Iterative, long-term PD approach (Borko) focused on content knowledge (CK) and pedagogical content knowledge (PCK)
  • Dialogic Video Cycle (DVC)
    • Builds upon PSC model
    • Focuses on verbal interactions between teachers and students
    • “In the DVC the focus is on the implementation of the two activities student activation and clarifying discourse rights and scaffolding student ideas and feedback (Walshaw and Anthony 2008). By helping teachers implement both activities in the classroom, the DVC aims to change the perspective of teachers towards engaging students in classroom dialogue and to support student learning processes.” (Gröschner, Seidel, Kiemer, & Pehmer, 2014, p.9)
  • “Therefore, through the lens of teacher PD components, video-based reflections as well as collaborative learning opportunities seem to be crucial aspects for teacher learning.” (Gröschner, Seidel, Kiemer, & Pehmer, 2014, p.25)

Kiemer, K., Gröschner, A., Pehmer, A.-K., & Seidel, T. (2015). Effects of a classroom discourse intervention on teachers’ practice and students’ motivation to learn mathematics and science. Learning and Instruction, 35, 94-103.

  • Motivation to learn
    • “Motivational concepts such as interest in the subject are important outcomes of educational processes (Krapp & Prenzel, 2011) and are key elements regarding the young generations’ preparedness for life-long learning as a core-skill in knowledge-based societies.” (Kiemer, Gröschner, Pehmer, & Seidel, 2015, p. 94)
  • DVC worked
    • “This study shows that after successful implementation (Gröschner, Seidel, Kiemer, et al., 2014), the video-based TPD approach of the DVC was effective in changing teachers’ behaviour towards more productive classroom discourse.” (Kiemer, Gröschner, Pehmer, & Seidel, 2015, p.101)
    • “The results of this study further show positive changes in students’ experiences of autonomy, competence and social relatedness as well as intrinsic learning motivation, when their teachers participated in the DVC intervention.” (Kiemer, Gröschner, Pehmer, & Seidel, 2015, p.101)
    • “The results demonstrate the importance of productive classroom discourse in promoting positive learning outcomes for students’ motivational orientations and its role in fostering student interest in STEM subjects.” (Kiemer, Gröschner, Pehmer, & Seidel, 2015, p.101)