Qualitative Research – Final Reflection

Assignment

Individual Process Paper Requirements

This paper is a final reflection on the process of doing qualitative research. In this paper you should:

  • Describe your growth as a qualitative researcher over the past 10 weeks using concrete details and examples to demonstrate areas of growth as well as areas you are still mastering
  • Reveal how you are pushing yourself toward new understandings, especially concerning the complexity of the research process
  • Connect your experience to class readings and class discussions. Show us some key topic areas you are grappling with… Be sure to use proper APA format

You may want to revisit past RDRs and show how your thinking has progressed. You may want to reflect on topics such as contextual interpretation, subjectivity, ethics, the analysis process, validity, and rigor.

Process papers should be between 4 and 8 double-spaced pages, not to exceed 8 pages.

Group mini-products will be evaluated separately from individual process papers. We will average the group grade on the mini-product with the individual grade on the process paper. 

Response

Abstract

This paper reviews the learning process I went through this quarter, presented in the form of a qualitative research paper. I propose to trace my journey from someone who had barely thought about research, let alone qualitative research, to someone who can now appreciate the power of this method for analyzing the world around us. Mindful of my own bias and drawing on metacognition, I will describe the salient concepts I acquired through the readings, class activities, and assignments.

The research question I want to answer is: “How does Lucas understand the qualitative research process?”

Introduction

As an engineering undergraduate, research for me lay far off in the realm of doctoral students and the confines of library microfilms. As a professional, I was always involved in project management and the implementation of software systems: very hands-on, practical work with little need to do or consume research.

Coming into the LDT program I had to decide between qualitative and quantitative research methods. My reasoning was that I already had some statistical background from Industrial Engineering, whereas I had had virtually no contact with qualitative research. I have not regretted this decision and feel that the course has provided me with valuable skills for observing the world and for consuming and producing qualitative research. It has given me a whole new set of lenses with which to critique my own design and thought process.

Hopefully this paper will illustrate the main takeaways from the course while evidencing my learning process and methods. By no means do I intend to claim that this qualifies as true qualitative research, since the data collection and analysis were not initiated as such – the study was an afterthought, which induces a top-down approach to finding meaning. I came in with what I wanted to find in the data and found it. My personal bias is also exacerbated by the fact that I am a full participant-observer (Taylor & Bogdan, 1998). I tried to be as objective as possible and hopefully attended to at least some of the “Criteria for a Good Ethnography” (Spindler & Spindler, 1987, pp. 18-21):

      1. Observations are contextualized: I attempted to describe my individual process in this paper while leaving out descriptions of the in-class context, since the intended audience of this paper was part of that context.
      2. Hypotheses emerge in situ: the learning process, and this paper, show that I came in with no prior knowledge of the subject and came out with what feels like a solid basis for future work.
      3. Observation is prolonged and repetitive: is a quarter long enough? Was I really observing my own actions repetitively? I could argue toward both ends of the spectrum: if I was not consciously observing myself for the purpose of this research paper, then the observations were not made. On the other hand, my blog, assignments, and memory provide sufficient data for this analysis.
      4. The native view of reality is attended to: well, I don’t think I can go more native than being the native myself.
      5. Elicit sociocultural knowledge in a systematic way: documenting every interaction with the course in my blog could be considered a systematic approach to eliciting my sociocultural knowledge, even though there is no record of sociocultural factors that might have affected my learning.
      6. Data collection must occur in situ: in the sense that I am collecting data from myself, I would consider that all data were collected by me, for me, and within myself.
      7. Cultural variations are a natural human condition: I was unable to find, throughout the process, that my cultural background somehow affected my learning. Even though I am from Brazil, my education has been entirely within the American and British systems, allowing me to feel ‘at home’ in this context and with the readings presented.
      8. Make what is implicit and tacit to informants explicit: hopefully I am able to lay out implicit behaviors and communication patterns in this paper by detailing the thought process behind each claim.
      9. Interview questions must promote emic cultural knowledge in its most natural form: I used the questions presented in the description of this assignment as a guide during my self-mental-interview. I feel they were sufficient to elicit what I have learned.
      10. Data collection devices: I used pencil, paper, a camera, and my blog as devices to collect data.

Surprisingly, according to the analysis above, this paper could very well qualify as a qualitative research paper. As discussed in the last class of this course, there are several examples of alternative and artistic research, such as poems, performances, novels, and documentaries: ‘The field allows it all’ (notes from week 10 class, 2015). All in all, I felt this was a valid approach to structuring and presenting the data collected, even though the data collection itself was originally intended not for this paper but for the purpose of learning.

Methodology

Data Collection

The structure of the course involved a series of readings, mini-lectures, in-class group discussions, individual papers, and practice of qualitative research. The main topics were presented in a logical progression (Appendix A) that scaffolded our understanding toward the field’s existing base of knowledge. A series of readings was assigned to support our in-class discussions and to present current research and thinking on each topic. Written assignments were used to assess the class’s progression through the course. Finally, we conducted a short practice version of qualitative data collection and then proceeded to analyze the data and present a mini-product.

My method for absorbing the content was primarily to stay engaged: attending all classes and reading and writing everything that was assigned. While reading, and during class, I noted down on paper the important concepts that jumped out at me. I was testing the notion that by going analogue and physically writing down my thoughts I might get the benefits of embodied learning: “The embodied interaction with things creates mechanisms for reasoning, imagination, ‘Aha!’ insight, and abstraction. Cultural things provide the mediational means to domesticate the embodied imagination” (Hutchins, 2006, p. 8). These notes were then photographed and posted to my blog (lucaslongo.com) for archival purposes.

Data Analysis

For this paper, I wallowed through the data – my notes – and interviewed myself mentally about the entire experience. I produced amended notes summarizing the general pieces of knowledge I had absorbed (Appendix B). These notes served as initial guides to the subject matter to be included in this paper. They also gave me the opportunity to experience grounded theory (Taylor & Bogdan, 1998), in the sense that writing this paper as a qualitative research paper emerged as the best way to present what I had learned from the course.

In being a hyper-metacognitive participant observer in this research process, I will now present the main propositions from the readings and the practice research process.

Findings 

The assignment of conducting qualitative research was a crash course in the field. Even though highly structured and scaffolded by the educators, the process allowed for experiencing the multiple steps, processes and analysis required. The progression of observing, preparing interview questions, interviewing, making sense of the data, and finally writing it up felt like a genuine simulation of the real thing. 

In particular, we had very little time to come up with a context we wanted to observe and to define a research question that interested us. For me that was, and still seems to be, the hardest step of research: what is an interesting question to ask? Is there a problem to be found? How much research has already been done in this area? Do I know enough about the context to be able to extract meaning from it? But I suppose this is the key and the seed of all research, alluding to the “1% inspiration, 99% perspiration” mantra that echoes in my head from my undergraduate studies.

The observation and interview processes did not yield many insights for me beyond the interview-question preparation phase. I had never structured an interview before and found the strategies discussed in class and in the readings extremely helpful for understanding how to better extract information from informants. Probing and markers were the concepts that stood out most for me as techniques I will take with me.

The process of analyzing the data and writing up the product showed me how much data can come out of a single one-hour observation and two hour-long interviews. I was also surprised by how much meaning can be extracted from micro-analyzing what the informant said. Not to mention the fact that our final conclusion, or theory, truly emerged from the data. My group was worried that, as much as we discussed, we did not feel we had anything interesting to say about our context. At the last moment, while arranging the propositions, a cohesive general thought emerged from them, allowing us to generate a conclusion that was both backed by evidence and meaningful to us. I was initially skeptical about the method of exhaustively coding the data, yet my convictions were thoroughly debunked once I had experienced it first hand.

Finally, one framework I found very helpful in the process was Petr’s diagram (Appendix C), which made the process somehow tangible in my mind. It is a great representation of grounded theory and the qualitative research process. Obviously the diagram was backed up by our readings and fruitful class discussions, without which it would not have had such an impact on me. It especially helped me in thinking about and creating propositions – the claims we could back up with evidence – all the way to the future implications of our findings: “Turtles all the way down”.

Conclusion

Throughout this paper, I have attempted to summarize the learning process I went through and what I took from this course. The course has further consolidated my learning about qualitative research, validated some of my learning methods, and made me aware of all the pedagogical techniques designed into the course itself. Considering that I would not have been able to engage in a meaningful conversation with other qualitative researchers prior to this course, I consider this experience a success in learning (and being taught) about the field. Thank you.

Looking ahead, I see room for improvement in my writing skills, especially in citing previous research. This ties into my technique of reading and note taking. I look back at the readings and find no highlights of meaningful phrases. My notes, stored as photographs on the blog, are not searchable. Because of this, I had to go back into the readings to pull out citations, and I had to decipher my sometimes messy handwriting to make sense of it. With this in mind, I am abandoning handwritten notes in favor of going straight to digital.

I also feel that I have to work on my own master’s project research question and start to plan out my research. This class gave me significant skills, techniques, and concepts for doing so. My entrepreneurial traits have a tendency to look for a solution with a top-down approach. Now I have grounded theory to reduce my anxiety about getting to ‘The’ solution – I see that I must dive into the context I want to meddle in, observe it exhaustively, understand how the natives navigate it, analyze, and then finally be better equipped to propose, claim, and, who knows, solve a problem.

Appendix A – Course Progression

Concepts

  1. The Nature of Qualitative Research
  2. Qualitative Methods — Why and When
  3. Data Collection: Observation
  4. Data Collection: Interviewing
  5. Examining Subjectivity
  6. Analysis: Making Sense of the Data
  7. Considering Validity and Rigor
  8. Ethical Issues

Readings

See Reference section below

Assignments

  1. RDR #1: The Observation Process
  2. Qualitative Research Critique
  3. RDR #2: The Interview Process
  4. Draft of “mini-products” 
  5. Qualitative Product Paper
  6. Qualitative Process Paper (this paper)

Appendix B – Amended Notes

Notes I generated in preparation for this paper:

Paper’s Structure:  

Act 1
Tell story from Week 1 – Week 10
Novelty of the subject

Act 2
Readings
Observation/Interview
Data analysis

Act 3
Main takeaways
Strengths and weaknesses
Room for improvement

Wallowing through blog notes:

Main take aways from the class: 

  • Qualitative research – or research itself.
  • The power of writing
  • Frameworks and concepts
    • Turtles all the way down
    • I as a camera
    • Grounded theory
    • Probing
  • Criteria for good ethnography
  • Participant observer – cool! Almost like spy work
  • Finding a research problem – that’s the hardest part I think
  • Interview preparation
  • Interview behavior
  • Coding – did not believe in it at first
  • Propositions – Petr’s diagram
  • Validity – just be clear how you wrote it – Geisha
  • Learning acquires you – Legitimate Peripheral Participation

Pushing myself

  • Improve on writing skills
  • Read and read and read more research
  • Identify my own research problem
  • Tension between researching and creating solutions
  • Stand on giant’s shoulder and do something?
  • Become a giant for others to be able to do something?
  • Interview process I think I’d do well
  • Need to practice more in extracting meaning from data, not so instinctive for me – never has been – I take facts for face value – maybe a good quality for less-biased field data collection and data analysis.

Appendix C – Petr’s Research Diagram

[Image: Petr’s research diagram]

References

Note: references are not in alphabetical order to preserve chronological sequence

Reading Assignments:

The Nature of Qualitative Research

Merriam, S. (2002). Introduction to Qualitative Research. In S. Merriam & Associates (Eds.) Qualitative Research in Practice. San Francisco: Jossey-Bass. pp. 3-17.

Miles, M.B., & Huberman, A.M. (1994). Qualitative Data Analysis: An Expanded Sourcebook. (Second Edition). Thousand Oaks, CA: Sage. pp. 1-12.

Spindler, G. & Spindler, L. (1987). Teaching and Learning How to Do the Ethnography of Education. In G. Spindler & L. Spindler (Eds.) Interpretive Ethnography of Education at Home and Abroad. Hillsdale, NJ: Lawrence Erlbaum Associates. pp. 17-22.

Creswell, J. (2003). “A Framework for Design,” Research design: Qualitative, Quantitative and Mixed Methods Approaches (2nd edition). Thousand Oaks, CA: Sage. pp. 3 -24.

Becker, H. (1996). The Epistemology of Qualitative Research. In R, Jessor, A. Colby, & R. Shweder (Eds.) Ethnography and Human Development. Chicago: University of Chicago. pp. 53-71. (link)

Geertz, C. (1973). “Thick Description,” The Interpretation of Cultures. New York: Basic Books. pp. 3-30. (link)

Data Collection

Taylor, S., & Bogdan, R. (1998). “Participant Observation, In the Field,” Introduction to Qualitative Research Methods. (Third Edition). New York: John Wiley & Sons. pp. 45-53, 61-71.

Glesne, C., & Peshkin, A. (1992). “Making Words Fly,” Becoming Qualitative Researchers: An Introduction. White Plains, NY: Longman. pp. 63-92.

Weiss, R. (1994). “Interviewing,” Learning from Strangers: The Art and Method of Qualitative Interview Studies. NY: Free Press. pp. 61-83, 107 – 115.

Subjectivity

Peshkin, A. (1991). “Appendix: In Search of Subjectivity — One’s Own,” The Color of Strangers, The Color of Friends. Chicago: University of Chicago. pp 285-295.  

Peshkin, A. (2000). The Nature of Interpretation in Qualitative Research. Educational Researcher 29(9), pp. 5-9. (link)

Analysis

Taylor, S., & Bogdan, R. (1998). “Working With Data,” Introduction to Qualitative Research Methods. (Third Edition). New York: John Wiley & Sons. pp. 134-160.

Charmaz, K. (1983). “The Grounded Theory Method: An Explication and Interpretation,” In R.

Emerson (Ed.) Contemporary Field Research: A Collection of Readings. Boston: Little, Brown. pp. 109-126.

Graue, M. E., & Walsh, D. (1998). Studying Children in Context: Theories, Method, and Ethics. Thousand Oaks: Sage. pp. 158-191 and 201-206.

Page, R., Samson, Y., and Crockett, M. (1998). Reporting Ethnography to informants. Harvard Educational Review, 68 (3), 299-332.

Emerson, R., Fretz, R., & Shaw, L. (1995). “Processing Field Notes: Coding and Memoing,” Writing Ethnographic Field Notes. pp. 142 – 168.

Validity and Rigor

Johnson, R. (1997). Examining the Validity Structure of Qualitative Research. Education, 118, pp. 282-292.

Wolcott, H. (1990). On Seeking –and Rejecting– Validity in Qualitative Research. In E. Eisner & A. Peshkin (Eds.) Qualitative Inquiry in Education: The Continuing Debate. New York: Teachers College. pp. 121-152.

AERA (2006). Standards for Reporting on Empirical Social Science Research in AERA Publications. Educational Researcher 35(6), pp. 33-40.

Anfara, Jr., V., Brown, K, & Mangione, T. (2002). Qualitative Analysis on Stage: Making the Research Process More Public. Educational Researcher 31(7), pp. 28-38. (link)

Ethics

Altork, K. (1998). You Never Know When You Might Want to Be a Redhead in Belize. In K. deMarrais (Ed.) Inside Stories: Qualitative Research Reflections. Mahwah, NJ: Lawrence Erlbaum. pp. 111-125.

 Lincoln, Y. (2000). Narrative Authority vs. Perjured Testimony: Courage, Vulnerability and Truth. Qualitative Studies in Education 13(2), pp. 131-138.

Products of Qualitative Research

Cohen, D. (1990). A Revolution in One Classroom: The Case of Mrs. Oublier. Educational Evaluation and Policy Analysis 12(3), pp. 311-329. (link)

McDermott, R. (1993). Acquisition of a Child by a Learning Disability. In S. Chaiklin & J. Lave (Eds.) Understanding Practice. Cambridge: Cambridge University. pp. 269-305. (link)

Rosenbloom, S., & Way, N. (2004). Experiences of Discrimination among African American, Asian American, and Latino Adolescents in an Urban High School. Youth and Society 35(4), pp. 420- 451. (link)

Other readings:

Hutchins, E. (2006). Learning to Navigate. In S. Chaiklin & J. Lave (Eds.) Understanding Practice: Perspectives on Activity and Context. New York: Cambridge University Press. pp. 35-63.

Qualitative Research – Final Product

Great working with James and Ana in this project!

Text version below and nicely formatted version here: Final Product.pdf

ABSTRACT

In this qualitative study, individuals involved with the Learning Innovation Hub (iHub) were studied to address the research question, “How does iHub facilitate collaboration between educators and entrepreneurs to promote education technology innovation and adoption?” To this end, an observation of the iHub fall 2015 orientation and two interviews with iHub Manager Anita Lin were conducted over the course of three weeks. iHub was found to facilitate collaboration between teachers and startups by seeing teachers as key agents in edtech adoption and focusing on teacher needs. iHub, in turn, does not focus on other stakeholders in the education ecosystem beyond teachers. This raises concerns about iHub’s impact on outcomes for learners.

(Keywords: education technology; edtech innovation; edtech adoption; iHub)

1 INTRODUCTION

Technology has the potential to revolutionize the ways in which we teach and learn. In recent years, a surge of education technologies has pushed more products into the hands of educators and learners than ever before. Investments in edtech companies have skyrocketed as well: during just the first half of 2015, investments totaled more than $2.5 billion, markedly surpassing the $2.4 billion and $600 million invested in 2014 and 2010, respectively (Appendix A) (Adkins, 2015, p. 4). In the 2012-13 academic year, the edtech market represented a share of $8.38 billion, up from $7.9 billion the previous year (Richards and Stebbins, 2015, p. 7). But how do educators find the education technologies that actually improve learning outcomes in a space increasingly crowded with players and products?

The Learning Innovation Hub (iHub) is a San-Jose-based initiative of the Silicon Valley Education Foundation (SVEF) in partnership with NewSchools Venture Fund. Funded by the Bill & Melinda Gates Foundation, iHub aims to provide an avenue “where teachers and entrepreneurs meet.” iHub seeks to develop an “effective method for testing and iterating the education community’s most promising technology tools.” (iHub website).

To this end, iHub coordinates pilot programs of edtech products in real school settings. The iHub model involves:

(1) recruiting early-stage edtech startups with in-classroom products to apply to the program,

(2) inviting shortlisted companies to pitch before a panel of judges,

(3) selecting participating startups,

(4) matching startups with a group of about four educators who will deploy products in their classrooms,

(5) jointly orienting educators and entrepreneurs prior to the adoption of the technology in the classroom, and

(6) guiding communication among participants throughout the pilot and feedback phase.

iHub plays a unique role in the edtech ecosystem of Silicon Valley given its position as a not-for-profit program that does not have a financial stake in the startups. As such, we are interested in better understanding iHub’s impact on improving learning outcomes through technology. This study seeks to address the following research question:

How does iHub facilitate collaboration between educators and entrepreneurs to promote education technology innovation and adoption?

2 METHODOLOGY

We followed a prescribed sequence from framing our research question through data collection and analysis. Although we did not conduct a formal literature review on the research topic, members of the research team began the project with prior experience of education technology use and adoption in the classroom. We also conducted an informal observation of the organization prior to the official start of the project; we attended the iHub Pitch Games, during which the startups were selected for the participation in the fourth cohort. Our subject was selected based on a combination of convenience sampling and alignment of interest in the subject within our team.

2.1 Data Collection

We used three primary sources of data: online documentation, an observation, and interviews. This triangulation of sources grounds the reliability of our findings and affords us varied insights into the native view, helping us understand iHub’s strategies for facilitating collaboration between educators and entrepreneurs.

We began our official data collection with the iHub website, which lays out the overarching priorities of the iHub program and which we continued to consult throughout the study. The website afforded us a preliminary understanding of what the program does. With this written information, we were able to compare what the program claims to do (per the website), what the program actually does (as demonstrated in the observation), and what the program says it does (as elucidated by the interviews).

We continued our data collection by conducting a one-hour observation of iHub’s fall orientation (Appendix B). The orientation represents the first in-person point of contact between the participating educators and entrepreneurs of the fourth iHub cohort. Coordinated by SVEF staff and spearheaded by Lin, it served as an ideal occasion for observation, as it showcased iHub’s role as a facilitator of communication and collaboration between educators and entrepreneurs. Bringing everyone together in the same room, the orientation covered everything from high-level discussions of iHub’s goals down to the administrative details of the initiative. All three researchers kept both raw and amended notes.

In the two weeks following the orientation observation, we conducted two one-hour interviews with Lin. Lin was selected as the ideal interview subject given her accessibility as gatekeeper to the research team, her position as iHub Manager, and her deep understanding of the iHub initiative. A peer-reviewed interview guide was used in both interviews, though interviewers let questions emerge in situ as appropriate. The first interview sought to garner an understanding of the overarching goals and priorities of both SVEF, as the parent organization, and iHub, as the specific program of interest. We explored what the organization does, what its processes look like, and Lin’s role within iHub. While this interview uncovered some areas for deeper discussion, we were intentional about keeping the first interview’s inquiries at an introductory level and saving probes for the second interview. The second interview, in turn, honed in on a more granular discussion of iHub’s role in the technology adoption process and in learning outcomes. Both interviews were voice-recorded, and approximately thirty minutes of each were transcribed (Appendix C).

2.2 Data Analysis

Our data analysis went hand-in-hand with our data collection, allowing us to adjust our concepts, themes, and categories throughout the research. While we did not create memos per se, individual research descriptions and reflections served to clarify and elucidate some of the themes and insights that emerged throughout the process. Raw and amended notes and interview transcriptions were coded with the following jointly designed list of codes:

  •   Educator feedback
  •   Entrepreneur feedback
  •   Examples of success
  •   Examples of challenges
  •   Focus on early-stage startups
  •   Focus on educators
  •   Funding partnerships
  •   Metrics for success
  •   Neutrality
  •   Opportunities for improvement
  •   Organizational design
  •   Stakeholder alignment
  •   Tension between decision makers
  •   The iHub model/framework

From there, we were able to identify themes and patterns in our data. As our research question seeks to understand a phenomenon, a grounded theory approach proved most appropriate. Through this grounded theory method, we derived the propositions described below.
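The mechanics of this coding-and-theming step can be sketched in a few lines of code. This is an illustrative sketch only: the segments and code assignments below are invented for the example, and only the code labels mirror the list above; grouping segments under each code is simply one way to make recurring themes visible before proposition-building.

```python
# Hypothetical sketch of tallying qualitative codes across data segments.
# The segments and their code assignments are invented for illustration;
# only the code labels come from the study's jointly designed list.
from collections import defaultdict

# Each segment of notes or transcript is paired with its assigned codes.
coded_segments = [
    ("Teachers requested same-school teams",
     ["Educator feedback", "Opportunities for improvement"]),
    ("Pilot capacity limited by teacher availability",
     ["Focus on educators", "The iHub model/framework"]),
    ("Program holds no financial stake in startups",
     ["Neutrality"]),
    ("Teams now meet weekly at one school site",
     ["Educator feedback", "Examples of success"]),
]

# Group segments under each code so recurring themes become visible.
by_code = defaultdict(list)
for segment, codes in coded_segments:
    for code in codes:
        by_code[code].append(segment)

# Codes attracting the most segments suggest candidate themes.
for code, segments in sorted(by_code.items(), key=lambda kv: -len(kv[1])):
    print(f"{code}: {len(segments)} segment(s)")
```

In practice the grouping and counting were done by hand on paper and in shared documents, but the logic is the same: segments cluster under codes, and dense clusters become candidate propositions.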

3 FINDINGS

3.1 iHub sees teachers as key agents in edtech adoption.

iHub sees teachers as key agents in edtech adoption. While the organization understands that entrepreneurs, school principals, district managers, and policy makers are all stakeholders in this process, it views teachers as the strongest drivers:

What we have heard from teachers and from districts, is that a lot of times for a school for adopt or…use a product across their school, it’s because a group of teachers have started of saying “I’ve been using this product. I really like this product. Hey, like friend over there! Please use this product with me,” and they are like, “Oh! Yeah we like it,” and kind of builds momentum that way (Interview Transcripts, 2015).

Under this notion that teachers can be strong advocates of edtech products, iHub is looking to adjust its curriculum around teachers as the key agents:

“So we kind of have been thinking about how do we build capacity of teachers to advocate for products they think are working well” (Interview Transcripts, 2015).

They also initiate their pilot cycles with the teachers defining what their current needs are, prior to selecting the entrepreneurs that are going to participate:

“So we send out to our teachers and they’ve kind of, I would say vaguely, have defined the problem of need, and we’d like to kind of like focus them on the future.”

Innovation, then, is driven by what teachers need in the classroom. These teachers are hand-picked based on their proficiency in adopting technology and their likelihood of giving better feedback and needs statements:

I think teachers who we pick, we try to pick ones who are very…very experienced with using tech in the classroom and so I think that you, you find that teachers who use tech in the classroom, you…it’s like their instruction is different (Interview Transcripts, 2015).

Going further, iHub wants to promote a community of practice to enable discussion and scaffolding amongst teachers open to edtech adoption:

And so I think our program is also to help teachers who are early adopters of technology, help them kind of meet other teachers at different school for early adopters, and build a cohort that understands that and kind of can refer to each other (Interview Transcripts, 2015).

Finally, when looking at entrepreneurs, iHub sees the teachers as the key agents to their success:

So I think that that’s why we’re working with early stage companies because I think it’s, it’s possible to find one now that meets the needs of many teachers and kind of help it kind of just move along (Interview Transcripts, 2015).

3.2 iHub has a focus on teacher needs.

iHub’s belief that teachers are key agents in edtech adoption leads it to focus on teacher needs. Many of iHub’s processes and resources revolve around satisfying the needs and constraints the teachers might have in the process. For instance, the orientation – albeit an event bringing together all participating stakeholders – was framed around the needs of teachers. Rhetoric revolved around “how do we choose the technology we use in the classroom?” (Observation Amended Notes, 2015).

The focus on teacher needs also became evident during our interviews with Lin. In fact, the program’s existence is rooted in the perceived needs of teachers:

Schools DO need edtech products…They want products that do x, y, z. But they don’t really know how to go about and find them. So I think that that’s why we’re working with early stage companies because I think it’s, it’s possible to find one now that meets the needs of many teachers and help it just move along (Interview Transcripts, 2015)

Lin explicitly described one of iHub’s main goals as:

To help teachers who are early adopters of technology, help them kind of meet other teachers at different school for early adopters, and build a cohort that understands that and kind of can refer to each other…we kind of want to help teachers understand how to use it—edtech—in their classroom (Interview Transcripts, 2015).

Furthermore, Lin acknowledged that iHub gives teachers various opportunities to communicate their needs:

There are lots. Every time we have meetings, we are very…open about that. And I also think, we have surveys, so there’s a lot of, we send out a lot of surveys about a lot of…different, specific, different…happenings, and so after…orientation happened, there was a survey that was sent out about that. After the Pitch Games happened, there was a survey about that…I also think that during the rounds if we have strong relationships with teachers, which is typically the case, then teachers are very open with us. At the end, I’ll be like, you know, we’d love to hear your feedback, and they’ll just tell us, you know, we’d love it if there was this, this, this, this (Interview Transcription, 2015).

Lin also shared two instances in which feedback from teachers was implemented, further demonstrating this focus on teacher needs. In the first, Lin pointed out:

We’ve made a lot of those changes based on teacher feedback.  Like for example, the reason why teacher teams are at a school site this year instead of from all different schools is partly because it makes sense, I think, to scale, but also because that was one of the big pieces of feedback that was given from the beginning (Interview Transcripts, 2015).

In the second, Lin highlighted a major change in the mechanics of the pilot program: instead of having one teacher per school, they are now working with teams of teachers from each school. This change was based on feedback from teachers who wanted more collaboration amongst themselves:

Yeah, so I think for our teachers we would like them to meet up kind of weekly. And when you’re not at the same school it’s a lot harder to meet on a weekly because maybe one night one school has their staff meeting and then the other night the other school has their staff meeting and then, you know, I think it was a lot of commitment to ask and I think a lot of teams found it really challenging and maybe would not always be there because of that (Interview Transcripts, 2015).

Additionally, in her description of the structure of the program, Lin conceded that the number of companies accepted into the program is contingent on the availability of teachers: “For us its capacity of teachers…In our last round we supported 25 teachers. And this round we have about 13 teams of teachers” (Interview Transcripts, 2015). In fact, the single metric for success of the iHub program that Lin identified when prompted was the number of teachers using the iHub model (Interview Transcripts, 2015).

In the discussion on iHub’s priorities, Lin also focused on outcomes for teachers.

3.3 iHub does not focus on the needs of other stakeholders beyond teachers.

Notably, we found no evidence that iHub focuses on the needs of any stakeholder group beyond teachers. At the foundation of this proposition lies the evidence supporting the previous proposition that iHub is focused on teacher needs; in other words, insofar as iHub concentrates its resources on teacher needs, it is by default not focused on the needs of other stakeholders. The following evidence further supports the proposition that iHub does not focus on the needs of other stakeholders.

When asked directly what the ideal relationship between schools and startups would be, Lin responded, “An edtech vendor is a provider, right? So they should be providing some service that fits a need that a school has or a teacher has or a student has in some way” (Interview Transcripts, 2015). She added, “They’re still…an early stage company so they’re…still growing and figuring out exactly what it looks like” (Interview Transcripts, 2015).

When asked about the school’s reciprocal responsibility to startups, Lin responded, “I don’t know, I never thought about that, it, as much that way” (Interview Transcripts, 2015). iHub does not have any expectations for what a teacher should contribute to the relationship.

iHub acknowledges that communication problems have arisen when a startup’s focus is diverted from the pilot program: “And so they became pretty unresponsive with our teachers. The teachers like, emailing me, and I’m like trying to get in contact with it” (Interview Transcripts, 2015). However, iHub does not have a process for holding startups accountable: “And so typically when there’s not communication between these parties…the pilot would not be as successful as it could because they weren’t communicating” (Interview Transcripts, 2015). iHub lacks the kind of relationship with startups that would allow it to raise and address an issue such as this.

Lin even identified areas for improvement in iHub’s relationships with stakeholders such as entrepreneurs and school districts:

And so how do we support districts where maybe they’re not as on top of edtech, how do we support their administration so that they understand the role of it, understand maybe how select it, and understand how teachers use it so they can provide the support both maybe in resources but also in professional development to their teachers (Interview Transcripts, 2015).

On a broader scale, however, iHub demonstrated only shallow knowledge of the intricacies of the larger education system:

I would say I don’t know enough about school districts and about school…counties, offices, to be able to know whether or not they’re functional. There’s a lot of bureaucracy, I think, that comes up when you work with the county and work with…there’s so many different needs and so many different people kind of working on it that sometimes…they can’t, they’re unable to kind of do certain actions because of different reasons, whatever they are. So I don’t know (Interview Transcripts, 2015).

Finally, there is little evidence of any direct concern with learning objectives or student satisfaction. The few moments in which students were mentioned follow:

I talked about how it helps students but really a big point, I think a big selling point for districts is that it helps teachers, we give a lot of teacher professional development during that time. (Interview Transcripts, 2015)

Yeah, so I think we just want to make sure the, the products are really relevant to students, right. And so, that’s the way we do it, is that you get feedback from students and teachers, but I think those needs change, right? Each year, this year, the needs are different than last year, because this year you’re using Common Core, and last year, maybe, it wasn’t as big of a, actually it was really big last year. (Interview Transcripts, 2015)

Only when prompted for a specific success story did Lin share the story of one student with mild motor skill difficulties who learned from the entrepreneur’s product:

I think also a lot of the other teachers who worked with that product really, their students really enjoyed it, because it is really engaging and they were making like, connections between the fact that, you know, I’m doing math… And I think that was a like a really wonderful example of a product that went really well (Interview Transcripts, 2015).

 

4. LIMITATIONS

To have gained a better sense of what edtech adoption really looks like within iHub’s process, we would have needed to conduct several further observations. We were able to see the process in its very early stages, but we believe that observing the classroom setting and the feedback meetings would have revealed richer data.

What we observed was also less relevant to the overall research question than the interviews were. What we saw was the very first meeting between teachers and startups. We observed the initial questions and doubts about the use of the product, but no teacher had yet tried it. It would have been valuable to observe the product already running in the classroom.

Our limited prior knowledge of what the company did, of what we were observing, and of its overall goals also reduced our capacity to probe deeper into the subject matter. In hindsight, we might have chosen to interview a teacher or a startup founder instead of the manager of the program, for instance.

5. CONCLUSION

iHub views teachers as key players in edtech adoption, given their position to advocate for technology products among their colleagues and other stakeholders within the system, and this view of teachers has led it to focus on teacher needs. Our evidence demonstrates that iHub’s goals, its active pursuit of teacher feedback, the changes implemented within the program, and the program’s metrics for success all point to this focus on teacher needs. Insofar as iHub focuses on teachers’ needs, it deprioritizes the other stakeholders in the education ecosystem. In fact, our evidence shows that this focus has led to weak relationships with other stakeholders, including entrepreneurs, districts, and learners.

5.1 Implications

Our conclusion raises concerns about the ethics of iHub’s facilitation of edtech adoption. In optimizing to meet the needs of teachers, iHub is not optimizing for learner outcomes. In our evidence, little priority is given to students, learning objectives, or teaching pedagogy. iHub trusts that teachers operate with the best interests of their learners in mind, but we cannot be certain that this is always the case. We do not actually know the real implications of the iHub program for learning outcomes, and no research is being done to understand whether it benefits or harms students.

REFERENCES

Adkins, S. (2015). Q1-Q3 international learning technology investment patterns. Ambient Insight. Retrieved from http://www.ambientinsight.com/Resources/Documents/AmbientInsight_H1_2015_Global_Learning_Technology_Investment_Patterns.pdf

Learning Innovation Hub. (n.d.). Learning Innovation Hub website. Retrieved December 7, 2015, from http://www.learninginnovationhub.com/

Richards, J., & Stebbins, L. (2015). Behind the data: Online course market. A PreK-12 U.S. education technology market report 2015. Washington, DC: Software & Information Industry Association.

APPENDIX A: Investments Chart

Source: Adkins, S. (2015). Q1-Q3 international learning technology investment patterns. Ambient Insight. Retrieved from http://www.ambientinsight.com/Resources/Documents/AmbientInsight_H1_2015_Global_Learning_Technology_Investment_Patterns.pdf

APPENDIX B: Orientation Observation

[Two screenshots of orientation observation notes, captured December 7, 2015]

APPENDIX C: Interview Transcripts

Interviewee: Anita Lin, ‎iHub Manager at Silicon Valley Education Foundation (SVEF)

Interview 1: Oct 29 2015

Audio file: download

Interview 2: Nov 5, 2015

Audio file: download

13:49:00

[Ana] What would you say, um, the goals specifically of innovation is, the innovation group within SVEF is?

[Anita] i think the goal is to find…find innovative things that are happening in education and help support their growth. That’s what I think the innovation side is focused on.

[Ana] Ok. So you pointed to the three different stakeholder groups that are kind of important as you’re going towards your mission, which are students, teachers, and, um, districts [yeah], um. But you said that they’re not always aligned. And so how does, in the work that you’re doing bringing the different stakeholders, and adding even a fourth stakeholder to that, how do you try to align those different groups?

14:53:01

[Anita] Yeah, that’s a good question…We, when I think about it, what I mean is mostly that when you find ed tech companies and you recruit them, some ed tech companies are focused on students and the classroom experience; some ed tech companies are focused on the school experience, or like maybe making life easier for teachers, which I would consider different than a product that…instructs students for math; and then some products, right, are learning management systems, and those are for your district. You want to be able to use them across the district so that all the information is centralized. And so those, so when, so depending on the person who’s looking at a product and their position in that whole spectrum – a student, a teacher, a principal even, an instructional coach, or like someone in the district – the way they look at a product is different. That’s what I meant by that. [mhm] So, does that answer your question=

[Ana] Yeah, yeah, yeah absolutely. And how, could you speak to the challenges of, like, actually aligning [yeah] those groups.

[Anita] Yeah. I think that’s always a tension that happens in education, not just…within our work, but as a whole…sometimes, for example, here’s an example that I ran into last weekend. Someone was telling me, they worked with one of our companies previously. But…what happens in their district is very, I think they’re very on top of the policies basically, and so they approve certain, certain companies for use in the classroom because they meet all the privacy laws and all those…all these requirements they set, so I think privacy laws and more. And so because for some reason the district didn’t approve this one product she had been using before, and so this year she can’t use it. And so, right, to her, she, the way this teacher sees it is like, well, like my students really want to do it, I really want to do it, why can’t I just do this? But then the district sees it as like, you know, we have a process. This didn’t fit our criteria for some reason or the other. And so therefore, we don’t allow it. Right? And so then there’s that tension, and I think we’re still figuring out how you solve that. [yeah] I think it’s, I think it’s a tension for anything, ’cause even curriculum, that happens in curriculum…textbook curriculum adoption. So, [yeah] yeah.

17:17:05

[Ana] So do you see a role for SVEF, in that specific situation, to facilitate [yeah] alignment?

[Anita] So we kind of have been thinking about how do we build capacity of teachers to advocate for products they think are working well. We also have been thinking about how we support districts in understanding ed tech. So if a district, right, we know that some districts are really on top of ed tech in Silicon Valley and some are not. [mhm] And so how do we support districts where maybe they’re not as on top of ed tech, how do we support their administration so that they understand the role of it, understand maybe how select it, and understand how teachers use it so they can provide the support both maybe in resources but also in professional development to their teachers.

[Ana] Can you give a specific example?

[Anita] Yeah, so we haven’t done this yet exactly, which is why maybe I don’t have a great example, but in the spring, we’re thinking about how do we build capacity of the district. And so we are thinking about convening some…instructional tech directors in a meeting and having them kind of talk about challenges they faced or things they’ve done really well in implementing education technology in the classroom. And so some work that supports this (inaud), which I mentioned earlier, is that we do these ed tech assessments where we go to different school districts and…walk them through an ed tech assessment from hardware all the way to software. So do you have enough access points? To do you provide training for your teachers when you do Google Classroom or whatever product they’re using. So we kind of want to use that to support our teachers.

18:55:02

[Ana] Yeah. Do you have a, (three-second pause) I guess, (three-second pause) where would you be, what point in the process are you in this now, if you’re thinking about it for the spring?

[Anita] So we have been, that’s a good question, so we’ve, we’ve done a couple ed tech assessments in the area that we’re kind of targeting right now so the East Side Alliance area…and we are targeting the last week of July as like, sorry not the last week of July, January, as…this director get-together…so I think we’ll kind of get an aggregate report from that data and then run some sort of roundtable with these directors. So that’s kind of what we, we have the idea, we kind of have an idea of when it would be. We are working with Price Waterhouse Coopers, PWC, with, to implement this work. And so they’re creating a project plan currently. And so then we’ll kind of partner with them to execute on that.

20:03:00

 

26:40:00

 

[Lucas] You’re good?

 

[Anita] umhum

 

[Lucas] Alright so… hum… you guys good… hum… so… I think, hum… we’re gonna dive into a little bit more about the model you mentioned

 

[Anita] Ok

 

[Lucas] So if you could tell me in your own words what’s the process that the startups go through with iHub prior to Pitch night?

 

[Anita] So we’re recruiting startups that are early stage so, what I would say we define that between Seed and Series A, hum… but I think it’s probably more broadly interpreted than that and so… We kind of reach out to contacts we have in the Bay Area and maybe a bit nationally and ask them to help us pass on the message that we are kind of looking for Ed Tech companies that are, that could be used in some classrooms specifically.

 

[Lucas] Ok

 

[Anita] From there companies apply online through a, like, a Google Forms. It’s pretty simple. It’s a very short application process, but I do think we’ll probably add to that next year. hum. And then following a certain time period I convene the invites of different people to be part of a short list committee. And so our short list committee consists of venture capitalists, it consists of accelerator partners and then also people from the education community so that typically is maybe a like an Edtech coach of the school or an IT Director at a school. Hum. Potentially some educators as well. So we kind of bring together this, this committee that, from all of our applications we f it down to 12. Then we ask those 12 to pitch other pitch game and we kind of ask them “Hey, focus on things you used in STEM classrooms” and we, we invite judges that are business judges. So typically CEOs of big companies in the area and then also education leaders. so we had [name of person] one. There’s also, hum… we’ve also had people who (whispers) trying to think of who else… (normal) we’ve had educators, we’ve also had IT Directors as well ‘cause we kind of think, you know, they’re different slices of the education world so we have both of those be there and then they pitch and then the judges ultimately select the pool of companies that we work with for the round.

 

[Lucas] So you mentioned there’s 12 applicants… 12 selected [uhum] and then how many go to hum, the actual orientation?

 

[Anita] We pick between 6 to 8 companies [6 to 8] This last round we picked 6 hum, I think we have 11 pitches so that’s probably what we have.

 

[Lucas] Uhum… Is there any, hum… reason for this number?

 

[Anita] For us its capacity of teachers [capacity of teachers] so we support, in our last round we supported 25 teachers. And this round we have ’bout 13 teams of teachers. And so we kind of didn’t wanna companies to support more than 2 or 3 although I think… we… we… we didn’t wanna it to be super challenging for companies to support and also since they are early stage products, we found that some companies as they’re taking off, like, they get really busy and they’re like, completely immersive so… I think it’s to balance both the support aspect but as well is kind of the teachers that we can support also.

 

[Lucas] Uhum… so let’s go a little bit back, uh, what happens between the pitch night and orientation in terms of your interactions?

 

[Anita] So we send out to our teachers and they’ve kind of, I would say vaguely, have defined the problem of need, and we’d like to kind of like focus them on the future. Make ’em define it a lot more clearly. Hum… but have… we send out… I send out a form that kind of says, you know, “You’ve seen these companies at the pitch round. Here’s some more information about them if you’d like”. And I ask them to preference these different companies. So like, 1, 2, 3, 4 I mean we have them preference them whether or not, they’re going to work with all of them. And so, we… then they preference them and I kind of match them typically if I can I just give them their first choice of company that they’d like to work with cause I think that (mumbles), builds a lot of  investment in our process, hum… and then by orientation they know who they are working with and then we kind of tell them that all of… We’ve already told them all the program requirements before but we kind of go over them at orientation and then go over… let them meet their companies for the first time.

 

[Lucas] Great. And how about after orientation? What happens?

 

[Anita] So after the orientation we kind of let them go and set up their products for about a week or two depending on the time crunch we have from the end of the year and then… for the next couple of weeks they use the products in the classroom. There might be some observations but I would say these observations are mostly from a project management perspective more than like, an evaluative one. And then they submit feedback. And so we have some templates that we give them that we ask them to submit feedback from. There’s probably have a guiding question for each one and each week we’ll update that guiding question. Also we use a platform called Learn Trials which kind of gets qualitative feedback generally from these teachers about the product and includes comments but also has a rubric that they kind of use. And we’ve asked for pre and post assessments in the past that our teacher created ahm, but this probably hasn’t been… we have not been doing that. I think we need to find a better way to incorporate, so…

 

[Lucas] So, so… tell me a little bit more about this tool for the Qualitative Assessment. You said the name was?

 

[Anita] Learn Trials – and so they have a rubric that assesses an ed tech company across different strands whether that’s usage, whether that’s how easy was it for it to set up. And they kind of just rate them almost like grading it you know, like give it an A, give it a B. So like kind of like over time. And we ask them to submit it in different, like different… on a routine, so every 2 weeks or something. Where you’re able to kind of see how the product performs over time.

 

[Lucas] And am I correct to assume that after orientation the process goes towards, until the end of the semester?

 

[Anita] Yes – so it’s only until the end of this semester. So typically December, I want to say like  18th

 

[Lucas] And then what happens?

 

[Anita] And then at the end of this orientation we SVF maybe with the help of some of our partners like LearnTrials will aggregate some of this data and will share that out with the community. Additionally for this round something we’d like to do is maybe then from our 6 companies that we work with, work with a few of them and help them… help support implementation in the school versus just a couple classrooms that a school. So we’re still figuring this spring what that exactly looks like in terms of the implementation of the, these products but that’s something that we’d like to do.

 

[Lucas] And when you say community you mean both teachers, schools and the EdTech companies? You share it with everyone?

 

[Anita] Yeah

 

[Lucas] Hum… so what other events or resources you provide that have like similar goals or priorities? Or is this the only…

 

[Anita] Like within our organization? [yes] Well, in terms of teachers support, like, our Elevate Program I know… I talked about how it helps students but really a big point, I think a big selling point for districts is that it helps teachers, we give a lot of teacher professional development during that time. And so I think our program is also to help teachers who are early adopters of technology, help them kind of meet other teachers at different school for early adopters, and build a cohort that understands that and kind of can refer to each other. Humm… so we also do some, some I would say professional development is not as extensive as all of it is but we kind of want to help teachers understand how to use it, EdTech in their classroom. Potentially, referencing… Our goal is to reference the SAMR model. So..

 

[Lucas] Uhum… And is this whole process the first cycle you guys are going through, or you have been doing this for a while?

 

[Anita] So we started our first round in the Spring of 2014 – so this is technically round 4 but we’ve itter… like… it changes… little pieces of it change each round. So in the past when we’ve done it, when I run it, it was just I would recruit individual teachers from schools and so then I would form them onto a team so maybe a school, a teacher from school A, a teacher from school B, and a teacher from school C. And in this round I re…, we did recruitment where I recruited teacher teams. So now it’s like 3 teachers from school A, 3 teachers from school B, and then they are all using the same product at their school site so I think that helps with the piece of collaboration that was harder to come by earlier.

 

[Lucas] And how was it harder?

 

[Anita] Yeah, so I think for our teachers we would like them to meet up kind of weekly. And when you’re not at the same school it’s a lot harder to meet on a weekly because maybe one night one school has their staff meeting and then the other night the other school has their staff meeting and then, you know, I think it was a lot of commitment to ask and I think a lot of teams found it really challenging and maybe would not always be there because of that. Hum… So… that was a big shift from that. But I think it really builds a community within their school. And I think, what we have heard from teachers and from districts, is that a lot of times for a school for adopt or you know, use a product across their school, its because a group of teachers have started of saying “I’ve been using this product. I really like this product. Hey, like friend over there! Please use this product with me,” and they are like, “Oh! Yeah we like it” and kind of builds momentum that way [uhum]

 

[Lucas] So yes, so I guess that talks to the implementation phase of the, of the software that they were trialling. Hum… could you tell of us of a, a specific hum, aaaaa, ww… what do you call this phase after orientation? the pilot? [the pilot] phase. The Pilot Phase. So. Yeah. Could you tell us one story that things went really well or things went really badly?

 

[Anita] Sure! So, there was a product that as used in the last round where I felt like, it was really… we saw a lot of interesting things happen, hum… but they’re all like lot of qualitative metrics. So it”s called Brain Quake, and actually the CEO and cofounder, he’s the… he actually was an LDT graduate, hum… but…  it’s basically this game on an iPad or… whatever, where you can play… you have little keys and you have to line the keys to get this little creature out of a jail essentially, but,  what was interesting is when you turn the gears it also kind of teaches an eight number sense, so it’s like, this interesting puzzle that kids kind of enjoy solving. And so he was using this in some classrooms in the Bay Area. Also one in Gilroy and this teacher was a special ed teacher. And so she was kind of showing them this and so… What I think was really, really successful that I found was that for one of her students, they had trouble with like motors skills. And so one of the skills that they had trouble with was kind of like turning the gear on the iPad. hum. But the student actually learned to turn the gear like to the left. Cause you can turn it both ways and they were able to like, learn that skill moving like, doing a different motor skill than they had before because they really wanted to play the game. And so I thought that was like a really wonderful example of how technology can be really inspirational or like really helpful versus I think their other, you know. Well, lots of examples in literature where technology just like, you know, it’s just a replacement for something. 
Hum… and so… I think also a lot of the other teachers who worked with that product really, their students really enjoyed it, ‘cause it is really engaging and they were making like, connections between the fact that, you know, I’m doing math and they could see h… they could understand that, you know, if I redid this into an algebraic expression… like they were coming up with that terminology and then they were like “we could just rewrite this into an algebraic expression”. And I think that was a like a really wonderful example of a product that went really well.

 

[Lucas] Did that product end up being hum, adopted or implemented in the school [Yeah, so…] effectively?

 

[Anita] That just happened this spring and I don’t think it has been yet. Hum… they’re still also like an early, you know like an early stage company so they’re, I think they’re still growing and figuring out exactly what it looks like. But I think that we are trying to support companies in that way. And we’re still figuring that out. So…

 

[Lucas] And was there ever hum… a big problem in a pilot?

 

[Anita] Yeah, let me think… typically I would say the problems that we run into in a pilot is where, companies are like working with their teachers and it’s going well but then sometimes companies get really… I guess it depends, now that I think about it. In the fall of last year the was one where the company like, the developers got really busy cause they’re just, start-up just took off. And so they became pretty unresponsive with our teachers. The teachers like, emailing me, and I’m like trying to get in contact with it, and so typically when there’s not communication between these parties, it would… the pilot would not be as successful as it could because they weren’t communicating. Things weren’t changing. Hum… In the spring, one of the things… The biggest challenge we found was actually testing. So testing was happening for the first time for Common Core and so what would happen was these teachers that email me, being like “I can’t get a hold of the Chrome Book carts”. Like, they just couldn’t get access to the technology they needed to run their pilot. And so… one teacher… her district told her this before she like committed to the pilot. And she just like pulled out. Like she’s like “I just can’t do this” like “I don’t have access to these, to like, the technology that I need”. Hum… But some other teachers, they were like, one of them told me she had to like had to go to the principle and like beg to use the Chromebooks on a like… on a day that they weren’t being used, but, I think because it wa the first year of testing, a lot of schools and a lot of districts were very, hum, protective of their technology cause they just wanted to make sure it went smoothly. And that totally makes sense. 
And so… for… because the… the testing when it… kind of… varied like when this would happen for the different schools but, some schools were more extreme in like saying, you know, were just gonna use it for all this quarter… like we… like, you know, we’re gonna lock it up and then others were like “Well… we’re not testing now so feel free to use it” So… That was a big challenge in our pilot this spring.  

 

41:55

 

Interview 2

 

[James] So, do you, do you se-, do you see other people sort of coming in and filling that spot? Um bes- I mean, iHub, right?

[Anita] Yeah?

[James] Um, has anyone else tried to do that or…?

[Anita] Yeah, actually, that’s a good question. That makes me think of something else. Uh, the county sometimes does it. 

[James] Mhm.

[Anita] So especially in California where there’s small districts, uh, there’s a district in Santa Clara County that has like two schools.

[James] Mhm?

[Anita] There’s a couple districts in San Mateo County that have three schools, and so this is like, you know two elementary schools, one middle school? They, the county, can be assisting in kind of helping develop tech plans. Maybe not per se rolling out of… it may not be rolling out of specific technology but they kind of help support infrastructure. They may also help with, let’s say if three districts in a county want to purchase a certain product, and they’re really small districts, so like, so their total makes like eight schools?

[James] (laughs)

[Anita] Right? The county can kind of help facilitate a purchasing plan with the other schools, so that way prices are more fair for the, that, those schools.

[James] Yeah.

[Anita] So I guess the county does sometimes play a role, but it depends on the county. It depends on the county leadership too.

[James] Mhm, and have you noticed um, how well they do?

[Anita] So, I think San Mateo is one of the counties that does well in this. Uh, I know that they’ve had some, they’ve definitely assessed their schools in San Mateo County two years ago for tech and how in-, the infrastructure is. I actually have a website that I can share with you about that. But it tells you like the ratio of how many like IT personnel there is to students, like but it doesn’t say anything about software. I think it’s simply in tech- tech adoption, it’s simply like the infrastructural side, not like software.

[James] Yeah.

[Anita] But that’s important, right?

[James] Yeah.

[Anita] You can’t have that without, you can’t have software without hardware, so, you know.

[James] Yeah, no, that is very true. Um, and so in, in cases like, like San Mateo, um what do you think iHub like sort of adds to the mix then?

[Anita] Yeah, so, I think because San Mateo County isn’t per se te-testing software,

[James] Mhm.

[Anita] Our goal is really to help support software implementation. 

[James] Mm.

[Anita] And seeing, you know, what works in software, what works in edtech that way, um, I think they’ve done a lot of the, the other research. And also, I think it’s changed a lot. Like two years in the tech, edtech world is a long time.

[James] Yeah.

[Anita] Like two years ago, it was, it, the landscape looked different, like Khan Academy like different. Some of these startups don’t exist, right? So, or maybe they did and they folded.

[James] (laughs)

[Anita] Like there was something, Amplify, I think?

[James] Oh,

[Anita] So, =

[James] Joel Klein.

[Anita] = So, so I think there’s a lot of like change in the world? So, I think that’s a big, I think you have to re-, continually assess in order to in order to have like a better read. So I think, I have, helps in, it can help in supporting the county. We’re trying to create like a systematic way to like do that, I guess, is assess kind of the edtech side infrastructure but also create a model so piloting of edtech, especially new edtech, is easier, and then there’s a route that’s more clear-

[James] Mm.

[Anita] -to the question for what works and what doesn’t.

[James] So, sort of, um, so would you say, so just to repeat, sort of like to rehash-

[Anita] Mhm?

[James] -what you’re saying, it’s, you’re sort of setting like this, um, like the front runner, right? You’re setting like this sort of example?

[Anita] That’s the goal. I think is to some sort of model that you can follow, like implement, like a flow chart almost. 

[James] Mhm.

[Anita] But we know that school districts are different, so there probably will be some choices or wiggle room in some of these decisions. But I think that’s the goal.

[James] Mhm. And, um, and you may have already touched on this. So what do you think is the, (pause) what DO you think is going to be like the iHub sort of like place in the world?

[Anita] I think the research part is really important. So I think school districts can always fund a lot of the research, and I think if we, we now have a process for matching and school support. But, I think the research cycle really brings it all together, so if we are able to create a strong research process-

[James] Mhm?

[Anita]  -uh, then it will be able to, schools will be able to kind of use the research process and say like, “This works, we should use this. This doesn’t work. This is why.” Give them feedback, hopefully they’ll change, the companies will change.

[James] Mhm.

[Anita] And then, move forward.

[James] Yeah. So how, how do you see that sort of unfolding?

[Anita] How do what?

[James] (in a clearer voice) How do you see that unfolding?

[Anita] Yeah, that’s a great question.

[James] (laughs)

[Anita] That research side is always the side that EVERYONE in this field struggles with. Um.

[James] Mhm?

[Anita] I think right now, there’s more and more literature on it, so we kind of start from there. We also work with different partners, so we’re kind of thinking about, uh, I know another group is doing design-based implementation research, so DBIR research. Uh, but it’s kind of the goal that everyone in the group—so the teachers, the adaptive helpers, the students—everyone plays a role in kind of designing, kind of giving feedback. It’s like implemented in the classroom, but they kind of altogether give feedback so that, over time, the product gets better.

[James] Mhm.

[Anita] Um, but in the world of research, (3) I think we’re kind of, WE are kind of on the exploratory research side slash design slash implementation side, so we’re like earlier. And so I think, we’re, we’re still learning a lot about the field-

[James] (laughs)

[Anita] -about what that looks like. So we have things in place, but I think we’re trying to make them more robust.

[James] Mhm. (in a softer voice) Very cool. (in a louder voice) Um, and so, we went rogue for a little while there. Uh, so, let me backtrack a little bit. What do you think, what do you think is sort of the ideal relationship between um, edtech company and school or educator?

[Anita] Hm, that’s an interesting question. I mean in my head an edtech vendor is a provider, right? So they should be providing some service that fits a need that a school has or a teacher has or a student has in some way.

[James] Mhm.

[Anita] So that’s how I imagine the relationship is, is that they’re providing something to the student. But at the same time, obviously, that providing something, it, it’s a ben-, there’s some benefit to the student, or teacher, or classroom that it brings (…) efficiency? Right? It could be classroom efficiency. It could also be like differentiating or like being able to adapt to each person where they’re at in the classroom. Um, but I think there has to be some sort of benefit to it. 

[James] Mhm. 

[Anita] Yeah.

[James] So that’s from the um, that’s from the edtech company to the educator.

[Anita] Yeah.

[James] And what about vice versa? 

[Anita] I think in my head, it brings I think it helps, I think it just, I think they’re able to give feedback? I don’t know, I never thought about that, it, as much that way. But I imagine that if a product is doing well, then it also provides, like over time, it’ll provide feedback, and that product will continue to get better, and it will continue also to grow in usage around the area.

[James] Mhm.

[Anita] Where, not necessarily around this regional area, but in the area that it’s being used.

[James] Mhm. Um, and would you say sort of, I mean, so iHub, your, your core mission, right, um is to, your value proposition was to, you know, sort of facilitate this interaction=

[Anita] Mhm?

[James] = Uh, do you think you, how, how would you want to like, I guess, how would you want to facilitate that ideal relationship, um?

[Anita] Yeah, so I think there are ways that we’re still working on to figure out exactly what that looks like, especially thinking like five years in the future? 

[James] Uh huh?

[Anita] But for now, I think no one kind of facilitates these relationships so we take the place to do that.

[James] Mhm.

[Anita] Uh I think in like ten years, ideally, we wouldn’t have to do that because schools and districts would be doing that internally, right. 

[James] Yeah.

[Anita] They would be able to set aside part of their budget to pilot products, not to pay the products, but maybe to pay the teachers and, or, maybe they don’t even, like, it’s part of the integral process of how you’re teaching so it’s related to the professional learning that happens.

[James] Mhm.

[Anita] In school. Um, and then they would use data collected from these pilots as decision points for whether or not to purchase the product, and then if they don’t purchase the product, or even if they do, kind of give that feedback to companies so that companies can change their product to be more appropriate for the education world. 

[James] Mm. Can you elaborate a little bit more on that, actually? It’s uh…

[Anita] Yeah, so I think we just want to make sure the, the products are really relevant to students, right. 

[James] Mm.

[Anita] And so, that’s the way we do it, is that you get feedback from students and teachers, but I think those needs change, right? Each year, this year, the needs are different than last year, because this year you’re using Common Core, and last year, maybe, it wasn’t as big of a, actually it was really big last year.

[James] (laughs)

[Anita] Maybe two years ago wasn’t the same, right?

[James] Mhm?

[Anita] So I think that that’s a big…

[James] Big?

[Anita] Difference. Yeah. 

[James] Yeah.

[Anita] So.

[James] Um, and is there other things that you, is there um, anything you would either do, well, what would you want to keep the same or would you want to do differently, or would you want to sort of sustain? Do more of?

[Anita] Yeah, I think there’s a lot of things that are good right now for matching process. I think that it’s really helpful that we have lots of connections to districts, so I think that we need to continue to maintain those relationships, but also continue to grow them.

[James] Mhm?

[Anita] Uh, I think starting with the problem of practice. So having teachers kind of come in with the need they want a product to use to fill-

[James] Mhm.

[Anita] -um, is important. But I think maybe something to change on that front is also how you help them define that problem of practice, because I think some teachers come in and say like, “We really want differentiation lists in math in third grade.” But then when they finally see the companies that are selected, they’re like, “Oh wait, we really want to do something else.” And it’s like, was that really a need of yours? Or were you just kind of saying that because it sounds like a need that everyone’s talking about?

[James] Yeah.

[Anita] Um, and so I think helping teachers really focus on a problem of practice, that’s something that we’re learning to work on, but (clears throat) this year at least, it was at least stated, versus in the past, it wasn’t even stated at all.

[James] (laughs)

[Anita] So continuing to going, going down that path is really important-

[James] Mhm.

[Anita] -in the matching process slash vetting process. Um, I think something that has been good especially in the Bay Area is working with early stage companies =

[James] Mm.

[Anita] =and so we work with early stage companies to, you know, it, it’s a good place to be for that. So I think for us, that’s a really good niche.

[James] Mhm.

[Anita] Um, but I do think as time goes on, something that needs to kind of change in the work is that we have to support both early stage companies but also like mid, like later stage companies, so that you know, teachers change their practice or you know, like, it, is it really affecting students if it’s in ten classrooms, right? Not really.

[James] (laughs)

[Anita] So like, I mean, it does, but, you know. It could have a wider, wider effect if there are more, if there are more, it was in more, if it’d shown that it actually should be in more. 

[James] Mhm.

[Anita] So. Other things, I think research similarly like, we have some protocol, some usability research, but I think it would be nice to step into a little, especially for later stage companies, how do you help with maybe, more specifically efficacy research? Which is how well or how well this product’s meeting a need that it said it’s meeting, that it said it’s trying to fix.

[James] Mhm.

[Anita] So, (in a very soft voice) that’s one, (in a slightly louder voice) one thing I guess. 

[James] Yeah.

[Anita] Mhm.

[James] So that, this is actually quite interesting the um, I guess for me, the, the idea of you know, early versus late stage, right?

[Anita] Mhm?

[James] Um you, you brought up that that’s sort of, you see that as your, as your niche, right? Is the early stage. 

[Anita] Yeah.

[James] Um and I can, I can guess as to why, but can you, can you tell me a little bit more?

[Anita] I think one of the big challenges is, in edtech, it’s like there’s so many edtech companies so it’s how do you kind of bring to the surface the ones that are promising? So, I think our goal in vetting the companies is to bring to the surface some of the more promising early, like, edtech companies and kind of help them go from early to mid. I think there’s a big jump from those two and some people don’t (laughs) don’t make it. 

[James] (laughs)

[Anita] Actually a lot of people don’t make it. =

[James] (while laughing) A lot of people don’t make it. 

[Anita] =Yeah, a lot of people don’t make it. 

[James] (laughs)

[Anita] Most. So.

[James] (while laughing) Yeah. 

[Anita] I think that’s the goal.

[James] Yeah. 

[Anita] Yeah.

[James] And, and so when you, when you call your, your niche, that sort of implies like a competitive advantage, right? Um for iHub=

[Anita] Mhm.

[James] =specifically? Um and so, um yeah, I mean, yeah. Could you elaborate more about this, that, that idea?

[Anita] Yeah, and I think, well I think that’s mostly because right now, schools DO need edtech products. Like they des-, they want it. They want products that do x, y, z. Um but they don’t really know how to go about and find them. So I think that that’s why we’re working with early stage companies because I think it’s, it’s possible to find one now that meets the needs of many teachers and kind of help it kind of just move along.

[James] Yeah.

[Anita] And it’s quote unquote adoption learning. Which, I use that word. I don’t really love=

[James] (laughs)

[Anita] =the word “adoption.” I think it has a lot of loaded meanings, but.

[James] Um (laughs).

[Anita] Yeah. 

[James] And, and wh- why, why do you think iHub is uniquely sort of in the position to do, you know, to like really understand that?

[Anita] Mhm? I think it helps that with a lot of partnerships we’ve previously formed, I also think that since we’re neutral, we’re not a school, we’re not an edtech company. I think that that puts us in a position to facilitate those relationships well. 

[James] Mhm.

[Anita] Uh I think if we were a school, then we’d be constantly thinking about like, “how much does it cost?” Like, uh other things, that I feel like schools HAVE to think about. 

[James] Yeah.

[Anita] Which I mean, are very important. We also keep those in our head when we’re recruiting, but I think it also gives us some neutrality, I think, as well, so. And with, the other side is that we’re not really affiliated with edtech venture funds, or like incubators, right. We have partnerships with them, but we’re not like soliciting. Or we’re not trying to make a sale, so school districts are more willing to work with us because we’re not like, “You have to use this product because we’re going to like make money from the fact that you use this product.” =

[James] Mhm.

[Anita] =It’s just like, “Oh, from the tests that WE did, and the research that we, research that we’ve done with other teachers, they really enjoy this product specifically for these things.” So, yeah.

[James] Um, and do you think, do you think, or how hard or easy or whatever do you think would be for, not a competitor, but like another sort of um, iHub model to come up and sort of, you know, also add to, add to the ecosystem?

[Anita] Yeah, so, the compa-, the other groups that we work with kind of do similar tests they run. We call them test beds. They have similar test beds. But the three that have been funded so far, we focus on early stage, I would say, iZone kind of focuses on design implementation research, and then Leap focuses on impact or efficacy research-

[James] Mhm.

[Anita] -so we kind of do have similar people in the space, but not here in the Bay Area.

[James] Yeah.

[Anita] I do think there are more and more coming up. I think (3) it would, I mean, it’s good to have more people doing research about this because no one knows how to do it well. 

[James] Mhm.

[Anita] So, I think that would be a good, it would be good in some ways, obviously. And then, obviously, for, in other ways it would be more competitive for us.  

 

36:54:04

[Ana] We’ve been talking a lot about the opportunities for educators to give feedback back [yeah] to the startups to improve their products. Are there any opportunities for the entrepreneurs and the educators that are involved in this partnership to give feedback back to SVEF?

[Anita] Oh yeah. There are. I never mentioned those, but there are lots. Every time we have meetings, we are very…open about that. And I also think, we have surveys, so there’s a lot of, we send out a lot of surveys about a lot of…different, specific, different…happenings, and so after…orientation happened, there was a survey that was sent out about that. After the Pitch Games happened, there was a survey about that…I also think that during the rounds if we have strong relationships with teachers, which is typically the case, then teachers are very open with us. At the end, I’ll be like, you know, we’d love to hear your feedback, and they’ll just tell us, you know, we’d love it if there was this, this, this, this. And that’s been helpful, and we’ve made a lot of those changes based on teacher feedback. Like for example, the reason why teacher teams are at a school site this year instead of from all different schools is partly ’cause it makes sense, I think, to scale, but also because we that was one of the big pieces of feedback that was given from the beginning. So.

38:15:00

[Ana] Great, that’s awesome. Um, another question that comes to mind is, you mentioned that, uh, you’re working with earlier stage companies as opposed to [yeah] later stage companies or startups. Um, when you think about how that impacts, or how that affects the actual impact of a product in a classroom (two-second pause), what comes to mind?

[Anita] I don’t know if I understand=

[Ana] Yeah, let me totally rephrase that. (observer sneezes) Bless you. Do you think that working with earlier stage startups, as opposed to later stage startups, impacts the, or affects the impact that a company can have on learning in a classroom?

39:10:08

[Anita] Uh. Could. I mean, it depends right ’cause if (pauses for three seconds) yeah and no, I mean, it depends…if you’re thinking about it like with time or if you’re thinking about it like short term or long term I guess. So one of the organizations that we work, a different (inaud) does this work with mature companies. And so what they do is they work with schools who have already kind of been using specific products in the classroom, and then they do very specific research using…data points and observation and kind of tells these schools…yes, this product works or…no, we don’t really think this product works…And I think that’s important for schools to know whether or not they’re paying for something that doesn’t bring any learner outcomes, right…or isn’t helping their teachers…adjust to 21st century learning or you know, just changing the way they teach…I think for us the goal is that, you know, we kind of help with this, this market where it’s a little, it’s not very defined…no one is really guiding these people. So they just come up with an idea, and they just kind of throw it out. And if it works, that’s great, and if it doesn’t, then not. But I think we’re kind of hoping to pull out some of those that work. But I think ours, the goal would be like it’s a longer term. You would find out over long term if it works versus something that’s more like yes that works or no that doesn’t work right away. So, yeah.

[Ana] Do you think there are any negative repercussions of trying out products that are so early stage on real learners?

[Anita] Yeah, good question. I think that it’s definitely a possibility. I would say that, I would say that I think teachers who we pick, we try to pick ones who are very…very experienced with using tech in the classroom and so I think that you, you find that teachers who use tech in the classroom, you…it’s like their instruction is different and so you know, depending on, I think they can make learning happen with almost, with different, different pieces. And so I think that’s one way we kind of counter from it. But it is true. But I think it’s like would you rather have teachers do that without any oversight as to whether or not that works? Or would you rather have them do it with some facilitation as to whether or not it actually, there’s some…you know, conclusion at the end like yes, it works or yes, it doesn’t. I think teachers sometimes already do that in the classroom. So. Yeah.

41:45:01

[Ana] Interesting. (eleven-second pause) I think that it’s interesting that you said that ideally in ten years, an organization like SVEF would be out of business=

[Anita] Well, I would say that the iHub Program. (both nod) Yeah. We do a lot of other shit. But=

[Ana] Yeah.

[Anita] I don’t know if that’s…an organizational goal. I think that would make the most sense, right, ’cause I think, in my head, if you…if you identified a problem and you’re able to solve it…that’s great. (laughs)

[Ana] Yeah. Do you, do you think that’s actually possible?=

[Anita] Going to happen? I don’t know. I think it’s hard to say because I would say I don’t know enough about school districts and about school…counties, offices, to be able to know whether or not they’re functional. There’s a lot of bureaucracy, I think, that comes up when you work with the county and work with…there’s so many different needs and so many different people kind of working on it that sometimes…they can’t, they’re unable to kind of do certain actions because of different reasons, whatever they are. So I don’t know. But that would be like, in an ideal world to be able to give a model to a school district, to any school district and be like if you kind of adopt this to your school…this is a way you could pick technology for your classroom, classrooms, and also give feedback to these developers, and then developers would also have a clear path to entry, which I think is a big issue in the market.

 

Tech 4 Learners – Final – Reflection on Design Project

1. Product/prototype

Our target learner was Achu, a calm and smiley 12-year-old boy who is fond of playing basketball, watching HotWheels videos on YouTube, and painting. He follows instructions well yet rarely initiates activities on his own. The same is true of communicating with others: he rarely speaks up unless he needs something, such as going to the bathroom or more paint. He responds to questions but is not always sure about his answers, often repeating the last words heard when answering. Our impression was that he knows the answer yet has trouble externalizing it appropriately.

We immediately focused on the idea of helping Achu initiate verbal communication, insofar as it would help him express his desires and needs more effectively. Our initial brainstorms revolved around using technologies such as VR and games that would prompt him for verbal responses or would require verbal input to be used. We generated a few statements that helped us focus on the learner’s needs and the solution:

How Might We

  • HMW help him say more words?
  • HMW motivate him to want to communicate?
  • HMW stimulate him to produce original words?
  • HMW make him comfortable sharing words with others?
  • HMW make him feel like his words have value?

This led us to the following Needs Statement:

“Achu is a shy pleaser who needs to practice creating his own words in order to facilitate him communicating with others.”

To achieve that, we created a low-resolution prototype which consisted of playing a video with no sound on the laptop and prompting him to narrate what was going on. The final goal was to have a video with his voice narrating the events. We were able to engage him in the activity, and on a few occasions he actually generated new words when prompted. We felt that the prototype achieved some of the initial goals, but something was still missing for it to be considered truly effective.

After this initial test, we were able to get feedback from Marina. She thought the prototype worked, but partially because narration is a technique that had already been used extensively by his speech therapists. He generated new words but still needed prompting from us. She also pushed us to think more about how he could transfer what he learned within our product to his everyday life. With this in mind, we evolved our learning goal to:

“We want Achu to learn the value of communicating with others.”

After presenting our findings from the initial prototype, we dug deeper into what was missing and discussed some more potential solutions. We finally connected with the idea that the value of communication shows most evidently when helping others. We could use a teachable agent in the product and elicit the Protégé Effect (Chase, Chin, Oppezzo, & Schwartz, 2009). We built on the idea that while Achu might not always find it natural to speak for himself, he might find it compelling to speak in order to help someone else.

We introduced Tom into our prototype: a blind cat who asks Achu for help figuring out what is on the screen. We scaffolded the experience by creating a simple learning progression. We start with a single word on the screen; Tom asks Achu what the word is, and once Achu says it, Tom thanks him for his help. We wanted to ensure that we were ‘valuing the process and not only the final result’ (Dweck, 2007). After 3 words, we moved on to 3 short sentences, 3 pictures, and finally 3 videos.
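The scaffolded progression above (three words, then three sentences, pictures, and videos, with Tom thanking Achu after each verbalization) can be sketched as a small script. This is purely a hypothetical illustration: the items, file names, and the `learner_responds` callback are placeholders standing in for the Wizard-of-Oz-driven prototype, in which a human judged each response.

```python
# Hypothetical sketch of the "Tom the cat" learning progression.
# Each level holds three items of the same type, mirroring the
# word -> sentence -> picture -> video scaffolding described above.
PROGRESSION = [
    ("word", ["cat", "ball", "paint"]),                      # placeholder items
    ("sentence", ["The cat is black.",
                  "The ball is red.",
                  "I like to paint."]),
    ("picture", ["picture1.png", "picture2.png", "picture3.png"]),
    ("video", ["video1.mp4", "video2.mp4", "video3.mp4"]),
]

def run_session(progression, learner_responds):
    """Walk the progression; Tom thanks the learner after each verbalization.

    `learner_responds(kind, item)` stands in for the wizard judging whether
    the learner said the word/sentence or described the picture/video.
    """
    transcript = []
    for kind, items in progression:
        for item in items:
            transcript.append(f"Tom asks about the {kind}: {item!r}")
            if learner_responds(kind, item):
                transcript.append("Tom thanks Achu for helping!")
            else:
                transcript.append("Tom gently prompts again.")
    return transcript

# Example: a learner who responds every time completes 4 levels x 3 items.
log = run_session(PROGRESSION, lambda kind, item: True)
```

The design choice worth noting is that Tom's thanks follows every successful response, not just completed levels, so the Protégé Effect reward (helping the cat) is reinforced continuously.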

On the day of the test we were unsure about the results, so we also brought a few other activities to gauge Achu’s engagement and the levels of communication we could elicit from him.

We had him play with an app that records what you say and plays it back with a funny voice through a character. He was soon bored with the activity.

We moved on to observing him assemble a jigsaw puzzle with a phrase, instead of a picture, as the complete set. He was very fast at combining the scattered words into a perfect sentence. He was also prompted to read it out loud, which he did with ease except for one word he did not know how to pronounce. He seemed embarrassed but was reassured by the teacher that it was okay to say that he did not know – which he finally did. This episode showed us that the Protégé Effect might actually work on him, since he would not want his ‘friend’ to not know something.

Our final activity prior to testing our prototype was to engage him with text messaging. He clearly understood what was going on and responded by typing onto Alex’s phone while I was in another room with the other phone. His only trouble was dealing with the small keyboard on the phone, which suggests he might engage well with this form of communication given a larger keyboard.

Finally, we tested our prototype. Achu was immediately fond of Tom the cat and rapidly replied to his prompts. The words, sentences, and pictures he verbalized promptly. The video also succeeded in promoting verbalization, although it took him some more time to think about what to say. Once he did, and Tom thanked him, his energetic and positive reaction was priceless and strong evidence that the Protégé Effect worked. He even clapped his hands and said “Achu is helping the cat!”.

What surprised me the most during the process was how a small adjustment in the product resulted in such a big change in the levels of engagement. The process of narration was still the same, yet the purpose and motivation behind it were now clear to him. Narration for narration’s sake did not have value for him; helping Tom did. It also reminded me that we were eliciting, in a small way, Joint Media Engagement (Takeuchi & Stevens, 2011) between Achu and Tom the cat. They were both consuming media and helping each other out – feeding off of each other, learning from each other. A lesson learned that I will carry into all my future design processes.

One thing I felt was missing in the process was a greater level of engagement with Achu’s teachers, Marina, and eventually his parents. At fault were a lack of time, schedule conflicts, and too few attempts on our part to communicate more frequently with the stakeholders. Yet for the purposes of the course and the learning process, the interactions were fruitful and thought provoking, always leading to new iterations and fine-tuning of the product.

2. Collaboration 

The collaboration within our team was effective. I assumed the creative and technical role, while Soren looked at our product through a more pedagogical lens and Alex handled the documentation and write-ups. It was a fruitful process in which I felt everyone in the group contributed effectively and pulled their own weight throughout. My multimedia skills helped us rapidly create the prototypes, presentations, and video. Soren’s teaching background helped us select the appropriate language, level of complexity, and scaffolds toward learning. Alex helped us by summarizing and documenting our meetings, tests, and findings.

Our process was very much guided by our class activities. We met only twice outside of class, not counting our three visits to OMS. This does not mean that we did not communicate outside of class: through Google Docs we constantly collaborated on the elaboration of presentations, texts, and ideas. This demonstrated the effectiveness of the scaffolds we received as designers from our professor, as well as our group’s efficiency in generating ideas and agreeing on the path to take.

Next time around I will certainly work again with all the collaborative digital tools we used to document and brainstorm our ideas. I will also take the lead in creating the multimedia content, since it is something I enjoy doing and I now see how valuable it is. As far as doing things differently, I would only wish for more time to interact with the stakeholders and the learner. I will push harder to communicate more effectively with the intended audience and try to get more insights into what the learner’s needs are.

3. Learning

My learning experience during the project was more one of trying to apply learning theories to the project than trying to be overly creative, as was the case in some previous projects I have worked on. The challenge was to design for a learner about whom we knew very little, but using the educational lens we were able to apply and test learning theories with some success. A core motivator for me was actually a little bit of the Protégé Effect mixed with the Four-Phase Model of Interest Development (Hidi & Renninger, 2006). Looking back at the quarter, I noticed that much of my effort toward creating a better product was drawn from wanting to please the other person, to teach, and to provide a benefit to his life – not to mention the desire to please the teacher as well in the process. As for the interest development aspect of learning, I feel I reached Emerging Individual Interest, close to Well Developed Individual Interest – depending on whether I am able to evolve the product in the future.

More importantly, I believe I improved my skills and techniques of rapid prototyping. The pressure of creating a functional prototype to be placed in the unguided hands of a user was removed by the “Wizard of Oz” technique. It allowed me to create more freely and rapidly, and to continue thinking freely about potential solutions instead of becoming invested in a product because of all the time spent detailing a quasi-product. Yet I also learned that designing this way requires some previous experience with prototyping: you must be able to predict the user’s interactions that might completely break the desired effect. Therefore, even a free-form rapid prototype has a Minimum Viable Product.

References

Chase, C. C., Chin, D. B., Oppezzo, M. A., & Schwartz, D. L. (2009). Teachable agents and the protégé effect: Increasing the effort towards learning. Journal of Science Education and Technology, 18(4), 334-352.

Dweck, C. S. (2007). The perils and promises of praise. Kaleidoscope, Contemporary and Classic Readings in Education, 12.

Hidi, S., & Renninger, K. A. (2006). The four-phase model of interest development. Educational Psychologist, 41(2), 111-127.

Takeuchi, L., & Stevens, R. (2011). The new coviewing: Designing for learning through joint media engagement. New York, NY: The Joan Ganz Cooney Center at Sesame Workshop.

Tech 4 Learners – Final – Notes

National Education Technology Plan

  • Focus on technology but need to use it for PD

Understanding by Design

  • Backwards design or backwards planning
  • Clear learning objectives
  • How could we incorporate game design practices into education?

Computer Criticism vs. Technocentric Thinking

  • Ed Tech is not the silver bullet – must come with pedagogy and PD

In-Game, In-Room, In-World

  • Kids learn plenty from each other
  • Kidification of education

The Perils and Promises of Praise

  • Growth mindset
  • Constructive praise – effort and process not ability itself (you’re so smart!)

Four-Phase Model of Interest Development

  • Model
    • Triggered Situational Interest
    • Maintained Situational Interest
    • Emerging Individual Interest
    • Well Developed Individual Interest
  • Teacher’s interest is probably best predictor of effective teaching
  • Teacher’s role is to provide:
    • Positive feelings
    • Generate curiosity
    • Provide opportunities
    • Guide on research

The New Coviewing

  • Joint Media Engagement
  • Design Guide
    • Mutual engagement
    • Dialogic inquiry
    • Co-creation
    • Boundary crossing
    • Intention to develop
    • Focus on content, not control
  • Challenges
    • Parents too busy
    • Parents unaware of needs
    • Don’t enjoy the same content
    • Desired interactions not always triggered
    • Little continuity into other family activities
    • Distractions are always present
  • Design principles
    • Kid driven
    • Multiple planes of engagement
    • Differentiation of roles
    • Scaffolds to scaffold
    • Transmedia storytelling
    • Co-creation
    • Fit
  • “What goes on between people around media can be as important as what is designed into the media”

Teachable Agents and the Protégé Effect

  • Care more about pleasing others than oneself, so having someone you need to help enhances learning through teaching this person

Tangible Bits: Beyond Pixels

  • Tangible User Interfaces

Horizon Reports

  • re-teaching our teachers how and what to teach

Learning Environments – Week 10 – Assignment

A little late, but here is my response to Paulo Freire’s “Pedagogy of the Oppressed”.


“In order for this struggle to have meaning, the oppressed must not, in seeking to regain their humanity (which is a way to create it), become in turn oppressors of the oppressors, but rather restorers of the humanity of both” p. 28

The idea that the oppressed become the oppressors is a powerful one, evidenced throughout history. It is actually happening now in Brazil with the Workers’ Party, which, when it finally got into power, became corrupt itself and geared all its efforts toward staying in power as opposed to attempting to restore the country’s humanity. “Their ideal is to be a man; but for them, to be man is to be oppressor.”

Yet the problem is deeper and ingrained in an educational system that has been lowering standards to enable children to “progress” within the school system without effectively progressing in learning – similar to “No Child Left Behind” here in the US. Critical thought must be taught to the students, which implies you must teach the educators to do so as well, or else it simply fails. By lowering standards, you perpetuate an oppressed workforce that is incapable of thinking critically and therefore deprived of a chance to instigate change.

“The pedagogy of the oppressed is an instrument for their critical discovery that both they and their oppressors are manifestations of dehumanization.” 

“Education as a practice of freedom – as opposed to education as the practice of domination.”

Intro to Teaching – Final Paper

EDUC 213:  INTRODUCTION TO TEACHING
FINAL ASSIGNMENT
Lucas Longo – Dec 5, 2015

Description of the Lesson

Mr. Fischer’s lesson utilized several pedagogical techniques to support the learning objectives stated at the beginning of the class. The lesson started out with direct teaching and lecturing to give students input about the subject (3:37). The teacher pulls from previous knowledge the students possess when mentioning cross-examination (3:10) and recalling that they have done this process before. Asking the students questions about the case, the teacher informally assesses their knowledge and summarizes what is being said in class on the blackboard by drawing a mind map. Scaffolding the students with prompts, probes, and further questions, the teacher was able to extract deeper and more precise responses.

Mrs. Gomez then switched gears into an interesting activity of having the students follow her directions in Spanish. She was able to model the problem of communication, identified during the class activities, effectively and realistically. She generated an affective response from the students (8:17), allowing them to feel what the problem was and hopefully promoting metacognition. This activity was preplanned by the two teachers and showed a great level of PCK applied to the lesson – they understood the content and created interesting ways to convey it to the students.

In preparation for the mock trial, the students broke out into groups. They moved from independent practice they’ve had with the content towards guided practice. After organizing themselves into the roles each wished to play, they discussed the subject matter with each other while the teachers moved from group to group facilitating discussion (9:40), assessing informally their knowledge about the subject and coaching them towards the trial. The teachers understood the concept of ZPD providing scaffolds for the students while not overdoing it by giving out all the answers. “I’m not going to do it for you. You know how to do it” (13:00). This activity then culminated in the actual reenactment of the trial and a debriefing session to summarize what the students had learned from the exercise.

Application of Course Content

Instructional Planning & Assessment

The teachers’ objectives for the lesson seem to be to illustrate to the students the importance of history and the connections that can be made with the present through the law and the court system. They wanted the students to understand how the US Supreme Court’s past decisions affect our lives today and how labor is an important aspect of understanding our past. By exposing the students to the Amistad case, topics about slavery, justice, and communication were covered, even if indirectly.

The teachers planned the instruction with care, with distinct types of instruction ranging from direct teaching, prior-knowledge assessment, group discussions, facilitation, and modeling to peer-to-peer teaching. We can say that the lesson moved effectively up the levels of Bloom’s Taxonomy: recognizing and recalling facts about the case, understanding what the facts meant, applying and analyzing these facts in order to prepare for the trial, evaluating what was important or not, and finally creating the mock trial reenactment. A full progression that visibly engaged the students.

The teachers used informal assessment to judge whether these objectives were met. By creating the mock trial exercise, they were able to see how well the students grasped the subject. An important part of this assessment was conducted throughout the group sessions, where the teachers could hear and interact with the students’ knowledge by facilitating their discussions. There was no formal assessment in the sense that there was no test or written document each student had to present.

Knowledge for Teaching

The teachers did show a wealth of content and pedagogical content knowledge in preparing for this lesson and teaching it. Their content knowledge about the Amistad case was necessary to be able to explain the case, focus on key issues, and provoke thought and questions about relevant facts. Without this knowledge they would not be able to guide the students in the process of creating the mock trial.

In terms of pedagogical content knowledge, the teachers understood the benefits of utilizing several techniques to pull from the students’ previous knowledge, elicit content associations, and even draw upon their funds of knowledge. A great example of this was the activity where instructions were given out in Spanish – it situated the students in a context that might very well be present in everyday life. It was so effective that it caused transfer when the student playing the lawyer used the same technique to ‘prove’ that the witness did not speak Spanish.

Their PCK extended to having the students model the trial, preparing for it using independent practice, group activities, and metacognition. At the end of the video one of the students summarized it well, saying that the ‘trial got all the information they already knew and made it more realistic’. Without PCK, I believe the teachers might not have been able to create such a rich experience for the students.

Differentiation

The lesson included several ways for the students to engage with, process, and construct ideas from the content. The students read about the case before coming into class, heard a lecture about the case, responded to the questions the teachers posed, worked in groups to dig deeper into the content, were scaffolded during the process of preparing for the trial, and finally created and enacted the trial to expose what they had learned. This diverse set of activities ensured that the students had several opportunities to engage with the content, ask questions, and absorb the material throughout the exercise.

An important part of their differentiation technique was illustrated in the group activity, where the students were learning from their peers while being scaffolded by the teachers’ facilitation and interventions. The trial also helped in the sense that the students were able to observe each other’s performances, which illustrated facets of the issues based on in-depth analysis of each other’s roles in the trial.

Language and Culture

The teachers were able to build off of and support the students’ community and cultural knowledge by choosing the case to be studied. The Amistad case dealt with issues that were culturally and historically relevant to the class since the majority of the students were African American and Latino, including the teachers themselves. With this, the subject matter was directly relevant to the students and taught them about their heritage.

The teachers supported English Language Development by showing the students the effects of not knowing another language and the communication problems that entails. They were aware that some students did not speak Spanish and used this to their advantage to illustrate the point. They also explained some concepts in Spanish to a student who needed it, and the other students were shown helping each other in Spanish to clarify concepts. In this sense, the whole classroom culture was geared towards accommodating bilingual students.

Classroom Management and Engagement

The classroom was well managed and the students well behaved, showing that the teachers had established control and respect. The norms of engagement seemed well established in the sense that students were comfortable participating, answering questions, and working in groups. During the group activities, the teachers walked around checking for doubts, encouraging the students to ask questions, share with their neighbors, and discuss their ideas. The teachers also allowed the students great freedom, promoting individual and group decision-making and making them think and build upon each other’s knowledge.

The group activity also allowed the students to reflect on the subject matter, with the teachers mediating discussions, and finally to demonstrate what they had learned in the mock trial. The teachers also worked with the students to draw out the underlying issues of the case, as well as making sure they were able to follow the mock trial’s proceedings and constraints.

By ‘making it real’, they were also able to obtain full class engagement and participation. The students were clearly interested in the subject matter and put in real effort to make sure that everyone on the team was on the same page, had no doubts, and was sure about what they had to do. The mock trial was a formidable way to assess the students’ knowledge and keep them motivated to present their best work.

Overall analysis of the strengths and weaknesses

The greatest weakness of this lesson, I found, was the debriefing session. Granted, we might not have been shown it in its entirety in the video, yet it seemed short and shallow. I would have spent more time with the group trying to pull out what they had learned, what they had found most interesting, and what they felt about the lesson’s structure. I would also have attempted to summarize what was learned and trace back to the learning objectives, along with what the teachers felt they had learned from the exercise.

The greatest strengths of this lesson were the multiple ways in which the students and teachers engaged with the content. It was not simply a lecture that exposed the students to the Amistad case and then tested them formally on what they could recall. The teachers created activities that engaged the students deeply with the content. The mock trial was a big motivator in the sense that every student participated in the activity, contributed to the task at hand, and, through self-investigation, deepened their knowledge of the subject. The teachers were also very attentive to all students and were able to access the students’ ZPD by providing scaffolding and facilitation so that they could reach a level of understanding high enough to create and act out the mock trial.

Particularly, I was pleased to see the teacher notice that one of the students, who did not speak Spanish, stood up following her peers’ cue rather than because she truly understood the instructions given in Spanish. The simple fact that the teacher noticed this action might, in some minor way, have acted as an informal formative assessment. This observation might lead the teachers to reflect and adapt their instructions for this task – they might say explicitly that you should only follow the instructions you undeniably understand. Yet I also feel there was a missed opportunity for the teacher to ask the girl who stood up why she stood up, and use that to demonstrate the social pressures that lead to involuntary or automatic reflexes within a community.

I was also pleased to see that there was absolutely no mention of a test, grades, Common Core, or any kind of formal assessment. This suggests that the teachers might be aware of reports such as the 47th Annual PDK/Gallup Poll of the Public’s Attitudes Toward the Public Schools, which shows that there is too much emphasis on standardized testing.

Finally, the lesson was an absolute success if we analyze it using the “Identification of Evidence-based Practices” framework (Simonsen, Fairbanks, Briesch, Myers, & Sugai, 2008). The use of two teachers instead of one, along with a well-organized and reasonably sized classroom, attended to the need to maximize structure. Clear rules were stated, revised, monitored, and enforced; one clear example was when the teacher interrupted the mock trial to correct a procedural sequence the students missed. The teachers also offered plenty of Opportunities to Respond (OTR) and Class Wide Peer Tutoring (CWPT), along with some Direct Instruction. There was no evidence of the use of Computer Aided Instruction (CAI) or Guided Notes, yet they could have been part of the assigned reading and homework activities. The strategies for acknowledging appropriate behavior were limited to verbal acknowledgements, yet were clear and precise. There was no evidence of inappropriate behavior, so we also cannot tell whether any strategy was in place to address it – a good sign that the class was well managed and that the students were seriously engaged.

References

Simonsen, B., Fairbanks, S., Briesch, A., Myers, D., & Sugai, G. (2008). Evidence-based practices in classroom management: Considerations for research to practice. Education and Treatment of Children, 31(3), 351-380.

Intro to Teaching – Video Notes

Notes on the video we were supposed to watch:

Video notes

The teacher shows pedagogical content knowledge when evidencing the need to make connections to the students’ funds of knowledge. He is also teaching for higher levels of Bloom’s Taxonomy when eliciting analysis of the subject matter, evaluating its implications, and even creating the reenactment of the trial.

1:31 “If you show connections. If you show how history repeats itself. If you show them  how history is still coming around even to the point that it affects you today”

Mr. Fischer – 8th Grade

      1. 2:53
        1. What we are going to study today
      2. 2:59
        1. Objective of the course – narration
      3. 3:10
        1. “Cross examination – we’ve done cases before” previous knowledge
      4. 3:37
        1. Input – lecturing about the facts
      5. 4:00
        1. Showing map – illustrating – situating story
      6. 4:30
        1. Asks questions to students – assessing what the students know
        2. Writes on the board the big ideas – modeling mind map
      7. 5:22
        1. Keeps pulling from students and scaffolding them to get out specific ideas
        2. Summarizes ideas on the board
      8. 6:30
        1. Bad handwriting
      9. 7:04
        1. “Todo el mundo debe estar poniéndose de pie en este momento” (Everyone should be standing up right now)
        2. Modeling the problem of communication
      10. 7:44
        1. You may sit down – classroom management
      11. 8:17
        1. How did you feel when I spoke Spanish?
        2. Eliciting emotions – affect
      12. 8:55
        1. Planning time – how to illustrate concept
        2. Preplan for when she would do it but let it flow
      13. 9:20
        1. Girl who stood up did not speak Spanish
        2. Formative assessment?
        3. Had impact
      14. 9:33
        1. Reminds students how we are all different and that we have to respect each other’s differences
      15. 9:40
        1. Students prepare for the mock trial in groups
      16. 10:20
        1. They all bring something to the table
      17. 10:32
        1. You decide amongst each others who’s going to do what
      18. 11:00
        1. No leader, everybody was equal
        2. Communities of practice
      19. 11:06
        1. Facilitating / coaching / scaffolding
        2. Provided content for their examination
      20. 12:26
        1. Informal assessment
      21. 12:53
        1. Modeling opening statement
      22. 13:00
        1. I’m not going to do it for you – you know how to do it
        2. Independent practice
      23. Day 2 – 13:21
        1. Cannot teach without allowing them to ask questions as well
      24. 13:28
        1. Homework – independent practice
      25. 14:50
        1. Informal assessment
      26. 15:23
        1. Did not have too much help from the teachers
        2. Worked independently
        3. Worked together – brainstormed
        4. Went out to do what we had to do
      27. Day 3 – 15:47
      28.  19:33
        1. Student speaks Spanish – transfer from the modeling example the teacher gave in classroom
      29. 23:22
        1. Summarizing what student learned – Bloom’s Taxonomy
      30. 24:10
        1. Debriefing
      31. 24:20
        1. We will continue to study
      32. 24:40
        1. Showing a real example of nowadays
      33. 25:00
        1. Anyone have any questions?
        2. Good job – great class
      34. 25:25
        1. Makes you think a little more
      35. 25:30
        1. Trial got all the information we already knew and made it more realistic


Intro to Teaching – Final Notes

Notes in preparation for writing the final paper:

Learning

ZPD – Distance between the actual development level, as determined by independent problem solving, and the level achieved through problem solving under adult guidance or in collaboration with more capable peers.

Transfer – learning in ways that allow us to solve novel problems that we may encounter later.

Metacognition – Knowledge about one’s own cognitive processes, aka one’s own abilities to learn and solve problems. Types of cognitive processes include: attention and fluency, short term memory, storage vs. retrieval, comprehension, motivation and transfer.

Expertise – could be a combination of both content-knowledge and organizational skill and ability to implement and expand.

Instructional Planning

“What will I do to develop effective lessons organized into a cohesive unit?” (Marzano, 2007)

Action steps:

  1. Identify the focus of a Unit of Instruction
    1. Focus on knowledge that leads towards the goals
    2. Focus on issues that lead towards the goals
    3. Focus on student exploration
  2. Plan for lesson segments that will be routine components of every lesson
    1. Communicate learning goals
    2. Track progress and celebrate success
    3. Establish rules and procedures
  3. Lesson design plan:
    1. Anticipatory set
    2. Objective and purpose
    3. Input
    4. Modeling
    5. Checking for understanding
    6. Guided Practice
    7. Independent Practice
  4. Plan for content specific lesson segments
    1. Help interact with new knowledge
    2. Help practice the new knowledge
    3. Help generate and test hypothesis about knowledge
    4. Lesson segments devoted to critical – input experience
    5. Lesson segments devoted to practice and deepening of student’s understanding of content
  5. Plan for actions that must be taken on the spot
    1. Engage students
    2. Rules and procedures – adherence or not
    3. Relationship with students
    4. Communicate expectations
  6. Develop a flexible draft of daily activities for a unit
  7. Review the critical aspects of effective teaching daily

Bloom’s Taxonomy

  1. Remember – recognizing and recalling facts
  2. Understand – understanding what the facts mean
  3. Apply – Applying the facts, rules, concepts, and ideas
  4. Analyze – Breaking down information into component parts
  5. Evaluate – Judging the value of information or ideas
  6. Create – Combining parts to make a new whole

Objectives – behavioral and measurable

Goals – longer term and might not be achievable

Content Knowledge and Pedagogical Content Knowledge

Differentiation

  • Working with ZPD
  • Account for individual differences
  • Adjusting scaffolds to the child
  • Adjusting scaffolds to the subject
  • Using scaffolds to guide students work in classroom
  • Capitalizing on student’s developmental interests

Readiness

  • No use to step over child’s natural evolutionary steps
  • Supporting social and emotional development
  • Supporting identity development
  • Cultural contexts and development
  • Learning diverse cultural contexts
  • School as a cultural context

Assessment

  • Formative assessment – learn from students and adjust teaching
  • Summative assessment – evaluate goals at the end of a teaching
  • Learning progression
  • Prior knowledge assessments
  • K.W.L.
    • What you Know
    • What you Want to understand
    • Later, what you have learned
  • Rubrics
  • Feedback
  • Assessing for transfer
  • Student self assessment
  • Formal and informal assessment
  • Equity concerns
  • Grades and motivation
  • High and low stakes assessment
  • Looking inside the Black Box

Diversity and Funds of Knowledge

  • Capitalize on school and community resources
  • The culture of power
    • “My kids know how to be Black. You all teach him how to be successful in the White man’s world” – Delpit, L., 1995
  • “We are currently preparing students for jobs that don’t yet exist, using technologies that haven’t been invented, in order to solve problems we don’t even know are problems yet.” – Riley, 2014

Professional Development

  • How to teach teachers to teach in a new way

Finals

8 more papers to write until Tuesday!!!

  • Qualitative Research – Mini Project (group work – almost done)
  • Qualitative Research – Reflection Paper (not started)
  • Learning Environments – Final Paper (group work – almost done)
  • Intro to Teaching – Final Analysis (watched video – write paper)
  • Tech 4 Learners – Pedagogical Compass (to start)
  • Tech 4 Learners – Reflection on Design Project (to start)
  • Tech 4 Learners – Chose 2: (to start)
    • 3 learning frameworks and why they matter
    • Advice to a future learning tool designer
    • Learning Technology Evaluation
    • OpEd for EdSurge

Wow…