
Curriculum Construction – Week 7 – Curriculum Critique Assignment

Curriculum Critique – tynker.com – Programming 100


Curriculum Construction – Winter 2016 – Lucas Longo

Tynker is an online platform that aims to teach programming to children from 7 to 14 years of age (grades 1 through 8) through interactive tutorials in which they create the logic for video games. Virtual characters guide the student through each challenge, showing step by step what has to be done. The challenges increase in complexity and gradually introduce new concepts and commands available in the environment. The tool is extremely attractive in terms of design and flexible enough to serve the various age groups it is intended for. An interface for teachers is also available to create lessons, register students, and assign the various lessons offered in the course catalog. Finally, teachers can track each student’s progress and responses to the quizzes presented during the challenges.

I analyzed in more detail their first programming course, which introduces the basic mechanics of controlling a game, such as moving a character on the screen, verifying whether a character is touching another object and reacting accordingly, and the concept of repetition, or loops. The student must drag and connect instruction ‘bubbles’ to create a chain of commands the character will perform once the ‘play’ button is pressed. If the commands are correctly positioned, the goal is reached and the student progresses to the next challenge. At the end of each lesson, a short multiple-choice quiz ensures that some basic concepts and terminology were understood.
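
To make these mechanics concrete, here is a rough text-based sketch of the kind of command chain described above. The function names and structure are my own illustrative assumptions, not Tynker’s actual blocks or API:

```python
# Hypothetical text equivalent of a beginner challenge: keep moving the
# character forward until it touches the goal, then react.
# (move_forward / touching are illustrative names, not Tynker's API.)

character_x = 0
goal_x = 5

def move_forward():
    global character_x
    character_x += 1                      # one "move" bubble

def touching(target_x):
    return character_x == target_x        # the collision check

# "repeat until touching": the repetition/loop concept the course introduces
while not touching(goal_x):
    move_forward()

print("Goal reached!")                    # the reaction once the check succeeds
```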

Nowhere on the website is there an explicit declaration of which curriculum ideology the design is based on. The nature of the interactions and affordances provided by the tool shows that there is very little presentation of underlying concepts; instead, there is immediate engagement with acting upon, using, and testing instructions. One might argue that several ideologies are present in the curriculum – which includes the technological tools, the nature of the challenges, and the ways of engaging with the concepts. “If you are going to teach a kid to swim, put them in a swimming pool” – Dewey’s constructivist concepts and the project-based learning methodology are evident. The challenges also follow a clear learning progression, offering students a heavily scaffolded continuity of experiences that build upon each other and provide the basic concepts the student will need in subsequent ones.

“From this point of view, the principle of continuity of experience means that every experience both takes up something from those which have gone before and modifies in some way the quality of those which come after.” (Dewey, 1938, p.35)

Cognitive Pluralists would argue that the subject matter itself, programming, taps into one of our innate abilities. “As a conception of knowledge, Cognitive Pluralism argues that one of the human being’s distinctive features is the capacity to create and manipulate symbols.” (Eisner, 1994, p.79) It is also a very practical activity where you learn by doing.

“Its meaning has shifted from a noun to a verb; intelligence for more than a few cognitive psychologists is not merely something you have, but something you do.” (Eisner, 1994, p.81).

The curriculum’s video-game-based interface and activities also draw on another powerful concept: Noddings’ (1992) care framework, which describes why students might engage with the content.

“We need a scheme that speaks to the existential heart of life – one that draws attention to our passions, attitudes, connections, concerns, and experienced responsibilities.” (Noddings, 1992, p.47).

I believe we can generalize nowadays that children are fascinated by video games and thus might find it relevant and interesting to design their own. In the process, they are exposed to the concepts of programming, video game creation, and even design, once they start customizing their characters and game environments.

The implicit assumption of this curriculum is that programming is an important skill to learn for the future. Explicitly, the curriculum matches several Common Core Mathematics, Common Core ELA, and CSTA Computer Science standards that students develop in each lesson. Clear charts map lessons to Common Core standards by grade. Through the lessons and activities, students are able to learn and explore several Math and English Language concepts. The company offers several courses beyond the Programming one I explored, including English, Science, and Social Studies projects. Each course contains several lessons, exercises, and activities the students can complete.

The design of the lessons is such that teachers do not need deep knowledge of programming to use them. As their website (tynker.com, 2016) puts it, “Built for Educators. No Experience Required.”, since their “Comprehensive Curriculum” has “Ready-to-use lesson plans and STEM project templates for grades K-8.” The requirement for using this curriculum is one computer or tablet per child and an internet connection. Even though this might not be a reality in all schools, I believe it is only a matter of time before the “one laptop per child” dream comes true. In any case, the site also provides challenges for students to complete on their own computer or tablet, in case the school does not provide adequate access. In other words, parents could use this site and curriculum to encourage their children to engage with programming.

The lessons provided are all clearly geared towards, and support, the intended goals and learning activities. For example, to introduce the concept of angles, the student has to program a spaceship to trace the lines of a star by giving it commands to go forward, turn at a certain angle, and repeat the process until the star is complete. I felt that an explanation of how to figure out how many degrees each turn should be, depending on how many points the star has, was missing. The aim of the activity was to introduce the concept of loops, and it was intended for a second grader, so this explanation might have been left out by design. I would have to pay for access to the more advanced lessons to find out how they introduce such derivations – which apparently are present in the organized course catalog.
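
For reference, the derivation I found missing is short: tracing an n-pointed star in one continuous stroke means the spaceship completes k full revolutions spread over n equal turns, so each turn is 360·k/n degrees (144° for the familiar 5-pointed star, where k = 2). Below is a minimal sketch using Python’s turtle module as a stand-in for the blocks (go forward, turn, repeat); it is my own analogue of the activity, not Tynker’s code:

```python
import turtle

def draw_star(t, points=5, step=2, side=100):
    """Trace a star in one continuous stroke.

    Each turn is 360 * step / points degrees:
    144 degrees for the classic 5-pointed star (step = 2).
    """
    angle = 360 * step / points
    for _ in range(points):   # the "repeat" (loop) block
        t.forward(side)       # the "go forward" block
        t.right(angle)        # the "turn" block

pen = turtle.Turtle()
draw_star(pen, points=5, step=2, side=100)
turtle.done()
```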

The assessment tool presents in a clear manner how students are progressing through the activities. A chart indicates the learning outcomes per student, with icons indicating their speed and accuracy in completing each task. I was not able to get access to the actual assessment tool, which made me curious about how detailed the assessment results are. I would like to know whether the teacher can see exactly which questions the students got wrong. My guess is that they can, simply because of how carefully and thoroughly the tool is implemented.

One unintended consequence I foresee in implementing this tool in the classroom is that younger students might get distracted by the character customization capabilities the tool offers. I’ve seen this happen firsthand when teaching with Scratch, a similar tool on which this one bases its block programming style. Students end up spending a significant amount of time playing with the color of the hair, clothes, and other such customizations instead of attending to the task at hand. The tool, though, cleverly limits what the student can do in each exercise and establishes a time limit for customizing the characters. Only in more advanced lessons can students engage more freely with all the features the tool offers.

If we apply Wiggins & McTighe’s (2005) WHERETO evaluation of learning activities, one could say that the tool attends to all of its criteria:

W = help the students know Where the unit is going and What is expected.

The lessons have clear goals and measures of success.

H = Hook all students and Hold their interest

I was actually entertained by the challenges presented and attracted by the design of the scenarios and characters.

E = Equip students, help them Experience the key ideas and Explore the issues

The interactive nature of the challenges does provide a rich means of attending to these criteria.

R = Provide opportunities to Rethink and Revise the understandings and work

The challenges themselves provide opportunities for students to redo and revise their programs in order to achieve their goals.

E = Allow students to Evaluate their work and its implications

The evaluation of the work comes directly from attaining the goals, so the feedback is immediate. The quizzes also provide an opportunity for students to evaluate what they have learned and reflect upon it, even if on their own.

T = Be Tailored (personalized) to the different needs, interests, and abilities of learners

The different lessons attend to the different levels and abilities students might have, and the tool is flexible enough to allow students to go as far as they wish with their programming explorations.

In conclusion, I was very impressed with the quantity, quality, and breadth of the tool. The introductory lessons are easy enough and sufficiently scaffolded for the ages they are intended for. There is a high ceiling as well, in the sense that you can move from block-based programming to actual programming in Java and connect to robots to enhance the tangibility of the learning experience.

“The artistry in pedagogy is partly one of placement – finding the place within the child’s experience that will enable her to stretch intellectually while avoiding tasks so difficult that failure is assured.” (Eisner, 1994, p.70)

References

Dewey, J. (1938/1997). Experience and Education. New York: Simon & Schuster.

Eisner, E. (1994). The Educational Imagination: On the Design and Evaluation of School Programs. (3rd. Edition). New York: MacMillan. pp. 47-86.

Noddings, N. (1992). The Challenge to Care in Schools. New York: Teachers College Press.

Wiggins, G., & McTighe, J. (2005). Understanding By Design. (Expanded 2nd edition) Alexandria, VA: Association for Supervision and Curriculum Development.

LDT Seminar – Master's Project Feedback

Got some good feedback today about my Master’s Project:

  • Look at StickyUnit
  • Focus on a content area
  • Talk to John Willinsky – making research public
  • Think about what kind of higher ed
  • Skillshare vs Udemy
  • Qualtrics – heat maps
  • Survey: not going to be published, confidential; where is it going to be published? Informal pilot testing, not a formal research paper.

TO DO:

  1. Create email for Udemy teachers
  2. Create survey for teachers
  3. Send it to all iOS teachers on Udemy

Survey Questions:

  1. Besides teaching online, what is your teaching experience or background?
  2. How many online courses have you already published?
  3. What is your core motivation to publish your content online?
  4. Did you take the course on creating a course? How useful was it?
  5. What were the main challenges in creating the online course?
  6. What do you feel is missing in the online platform that would help you create a better course?
  7. Did you feel the need to have someone helping you in the process?
  8. If you were to add any features to the content creation tool, what would they be?
  9. If you were to redo your online course what would you do differently?
  10. How did the students’ questions and comments inform you about ways to improve your course?
  11. Any further suggestions or comments?

Beyond Bits and Atoms – Week 6 – Reading Notes

Blikstein, 2015. Computationally Enhanced Toolkits for Children: Historical Review and a Framework for Future Design, Stanford University, USA

The generations of microcontrollers

  • The first generation: Pioneers of physical computing (LEGO/Logo, Braitenberg Bricks, and Programmable Bricks)
  • The second generation: Conquering the World (Crickets, Programmable Bricks, and BASIC Stamp)
  • The Third Generation: Broadening Participation and Accessing New Knowledge Domains (GoGo Board, Phidgets, Wiring, and Arduino)
  • The Fourth Generation: New form factors, new architectures, and new industrial design (Pico Cricket, Lilypad, Topobo, Cubelets, LittleBits)
  • The Fifth Generation: Single board computers (Raspberry Pi, PCDuino, BeagleBoard)


  • Selective exposure for usability: Embedded error correction
  • Selective exposure for power: Tangibility mapping
  • “The main construct proposed in this monograph (selective exposure) and its two subcategories (embedded error correction and tangibility mapping) could help understand the use of current products and give designers a framework to imagine new ones.” (Blikstein, 2015)

Horn, M. S. (2013, February). The role of cultural forms in tangible interaction design. In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction (pp. 117-124). ACM.

“In this paper I have proposed an approach to tangible interaction design that looks beyond physical analogies and universal sensorimotor experiences. Specifically, I have argued that designers can purposefully evoke cultural forms as a means to activate existing patterns of social activity along with associated cognitive, physical, and emotional resources. This approach to design was inspired by the notion of social and cultural funds of knowledge [8, 18] and by Saxe’s form-function shift framework [30, 31]. Using three examples I demonstrated what this might look like in action.”

 

Curriculum Construction – Week 6 – Reading Notes

Banks, J. (1993). The Canon Debate, Knowledge Construction, and Multicultural Education. Educational Researcher, 22(5), pp. 4-14.

  • Dominating groups
    • Western traditionalists
    • Multiculturalists
    • Afrocentrism
  • Polarized debate, primarily in popular press, no productive interactions
  • Positionality – started with feminist movement
    • “Positionality reveals the importance of identifying the positions and frames of reference from which scholars and writers present their data, interpretations, analyses, and instruction (Anzaldúa, 1990; Ellsworth, 1989).” (Banks, 1993, p. 5)
  • Five types of knowledge
    • Personal/cultural knowledge
    • Popular knowledge
    • Mainstream academic knowledge
    • Transformative academic knowledge
    • School knowledge
  • The rules of power
    • “Delpit (1988) has stated that African American students are often unfamiliar with school cultural knowledge regarding power relationships. They consequently experience academic and behavioral problems because of their failure to conform to established norms, rules, and expectations. She recommends that teachers help African American students learn the rules of power in the school culture by explicitly teaching them to the students.” (Banks, 1993, p.7)
  • From academia to the classroom – takes time
    • “Consequently, school knowledge is influenced most heavily by mainstream academic knowledge and popular knowledge. Transformative academic knowledge usually has little direct influence on school knowledge. It usually affects school knowledge in a significant way only after it has become a part of mainstream and popular knowledge.” (Banks, 1993, p.11)

Sleeter, C. (1996). Multicultural Education as Social Activism. Albany, New York: State University of New York Press. pp. 91- 115.

  • Multiculturalism as a form of dialogue and acceptance of several points of view
  • Curricula often attempt to include/induce minorities into the dominant culture
  • “Oppressors” claim that all the differences have been ‘resolved’ in order to maintain the status quo
  • Move away from trying to integrate and towards discussing and understanding difference

Eisner, E. W. (1993). Forms of understanding and the future of educational research. Educational researcher, 22(7), pp. 5-11.

  • Representations of meaning
    • “Representation, as I use the term, is not the mental representation discussed in cognitive science (Shepard, 1982, 1990) but, rather, the process of transforming the contents of consciousness into a public form so that they can be stabilized, inspected, edited, and shared with others.” (Eisner, 1993, p.6)
  • New forms for new understandings – but how to assess these multiple forms that go beyond text and numbers?
  • Must explore
    • “Working at the edge of incompetence takes courage.” (Eisner, 1993, p.10)

Gardner, H. (1999). The Disciplined Mind. New York: Penguin Books. pp. 186-201, 208-213.

  • Enhance understanding by:
    • Providing powerful points of entry
      • Narrative entry points
      • Numerical entry points
      • Logical entry points
      • Existential/foundational entry points
      • Aesthetic entry points
      • “Hands-on” points of entry
      • Interpersonal points of entry
    • Offering apt analogies
      • Powerful analogies and metaphors
    • Providing multiple representations of the central or core ideas of the topic
  • Issues
    • How does one orchestrate the three approaches to important ideas?
    • How does one spread this orientation to the rest of the curriculum – and what might the limitations be?
    • How does one assess the success of such an approach?
    • How might this approach be misunderstood?
    • In the end, what is the status of the true, the beautiful, and the good, and of their possible interconnections?
  • Possibilities and limits
    • Measures of success
    • Possible misunderstandings of the approach
    • Once more: the true, the beautiful, and the good

Curriculum Construction – Week 6 – Reading Reflection 2 Assignment

“Whether it comes after teaching, while teaching, or by teaching, we often think of assessment as something done to students, not with them.” (Coffey, 2003, p.76)

The word “assessment” is often thought of as the final grade on the report card or as standardized tests that simply rank or classify the student. It should instead be thought of as an opportunity for learning and an integral part of classroom activities. In an evolved, mature, and structured teacher-student dynamic, students can create their own quiz or exam questions, engage in reflections on their peers’ presentations or projects, and even grade each other’s tests. The idea might sound radical, yet the benefits may outweigh the extra work and planning it takes to ‘flip’ assessment. Being able to understand what quality work is, analyze your own work, and receive feedback and act upon it is a valuable lifelong skill.

“When students play a key role in the assessment process they acquire the tools they need to take responsibility for their own learning.” (Coffey, 2003, p.77)

One of the main purposes of assessment is external accountability, but its best application might be to improve student motivation and curiosity for learning and to improve the teacher’s efficacy. It is essential for students to understand the purpose of their own education and to feel responsible for it. Exposing students to how they will be assessed and to the enduring understandings a course will bring gives them a sense of purpose in their education. Not knowing why you should learn math or science renders the whole learning experience meaningless. This disconnect is minimized when the teacher starts by involving the students in creating the measures and activities that will demonstrate their understanding of what the course is all about.

“Through the students’ explicit participation in all aspects of assessment activity, they arrived at shared meaning of quality work. Teachers and students used assessment to construct the bigger picture of an area of study, concept, or subject matter area.” (Coffey, 2003, p.78)

Applying this in a classroom requires a significant change in teaching practice. It might feel that engaging students in everyday assessment practices takes precious time out of regular ‘content coverage’. Yet this very engagement creates, for the students, connections between content and demonstrations of knowledge, and between their own work and what quality work looks like. It might even make the teacher’s work easier, in the sense that the students create their own tests and even grade their own work. They also provide feedback to their peers during presentations and, in the process, learn by engaging with the material. Buy-in from school administrators should also be easy, since traditional assessments can still be used in this process. Large-scale assessment is itself an interesting topic students should be aware of; understanding why such tests exist can reduce the stress involved in taking them. But I digress.

So how might you apply this concept in practice? Coffey’s concept of “everyday assessment” fits in well with Wiggins and McTighe’s (2005) Understanding by Design process. The main twist, or difference, is that the process of determining acceptable evidence of student understanding would not be done in isolation, but with the students. The teacher would obviously have to provide the course’s goals and essential questions, but would do so in a manner that helps students understand why they are important to their lives (present and future) and how they will know if they actually learned the content.

“Despite initial resistance, as students learned assessment-related skills, demarcations between roles and responsibilities with respect to assessment blurred. They learned to take on responsibilities and many even appropriated ongoing assessment into their regular habits and repertoires.” (Coffey, 2003, p.86)

The process is not easy and it takes time, but it provides a sense of clarity for the teacher when planning a course or a lesson. It’s not about the content that has to be delivered; it’s about creating mechanisms that demonstrate students’ learning. It’s about reviving a child’s desire to show off to their parents what they have just accomplished. It’s about knowing what your parents expect of you and creating a relationship that is based upon growth.

References

Coffey, J. (2003). Involving Students in Assessment. In J. Atkin & J. Coffey (Eds.) Everyday Assessment in the Science Classroom. Arlington, VA: National Science Teachers Association. pp. 75-87.

Wiggins, G., & McTighe, J. (2005). Understanding By Design. (Expanded 2nd edition) Alexandria, VA: Association for Supervision and Curriculum Development. pp. 13-34, and 105-133.

Brazilian Education – Week 6 – Class Notes


Today we talked about how education is financed in Brazil and about the National Education Plan, in a talk led by Bob Verhine (Universidade Federal da Bahia).

Basically, Brazil spends quite a bit on education in absolute terms, but per-student spending is ridiculously low.

The plan has 20 goals, which are for the most part unattainable, vague, and/or non-descriptive. The focus seems to be to please all sides, be neutral, and avoid controversy.

Many policies and laws established in our Constitution for example, have never been put into practice…

Here are some points about the challenges of increasing expenditures on Brazilian education.

Teacher PD – Week 6 – Class Notes


Jigsaw exercise

Science Readings – Group 2
Audience – Teachers

Conceptual framework

  • STeLLA Conceptual Framework
    • Student Thinking
    • Science Content Storyline
  • Deepen content knowledge + STeLLA framework
  • Theory of teacher learning
    • “Theory of teacher learning. The program design was guided by a situated cognition theory of teacher learning and a cognitive apprenticeship model of instruction that view learning as naturally tied to authentic activity, context, and culture (Brown, Collins, & Duguid, 1989; Lave, 1988).”
  • Theory of science teaching and learning
    • “Theory of science teaching and learning. The STeLLA program is based on a constructivist, conceptual change view of science teaching and learning that focuses on making student thinking visible so that instruction can be responsive to students’ emerging ideas, understandings, and confusions and thus support them in developing understandings of science concepts and phenomena that genuinely make sense to them (not just memorized words).”

“With regard to our primary research question about program effects, we observed that students whose teachers experienced content deepening integrated with analysis-of-practice in their professional development program (i.e., STeLLA) reached higher levels of science achievement than did students whose teachers received content deepening alone.”

Control group was called comparison group

How did it influence:

  • Research question
    • Content embedded within practice
  • PD Design
    • Center around videos and artifacts
    • Inquiry based
    • Social constructivist where the group comes up with explanation
    • Link science content & pedagogy
  • Research Design

Most important to share?

  • How to do it

What doesn’t matter?

  • Statistics

*****************

PD for teacher

Math

  • Grounded on the content of the PD – algebraic reasoning
  • Student thinking
  • Addressing content knowledge
  • Focus on one key idea
  • On site support


LDT Seminar – Week 5 – Milestones Assignment

Here’s what I have so far in the definition of my Master’s Project – a work in progress.

Learning Experience Designer (LXD)
Lucas Longo
v.0, Jan 14, 2016

ABSTRACT

The trend towards blended learning environments is irreversible, and an increasing number of higher education institutions are moving in that direction. It is a labor-intensive task for professors, who must transition from a traditional classroom or lecture-hall model to an online environment. Aside from the learning curve of any LMS, new content must be created and organized: PDFs, images, videos, links, and animations, to list a few. The challenge is to make it easier for professors, who for the most part do not have formal pedagogical training or multimedia content creation skills, to publish their courses following research-based best practices.

Learning Experience Designer (LXD) is a curriculum construction tool that adapts to your teaching context and learner needs. It also provides all the multimedia creation tools you might need to record and edit video, annotate images and PDFs, or create animations. It utilizes artificial intelligence to suggest course formats, pedagogical strategies, activities, and challenges, providing references to works others have already created and tested. The final result is a published course that can be accessed via a browser or a mobile app, where students can engage in forums and peer-to-peer coaching.

As a proof of concept, I propose to use an existing LMS (Canvas, Coursera, or EdX) as a base and add the proposed functions, content, and interactions onto its interface. These new features will then be presented to and evaluated by teachers who have experience with the LMS. The goals are to judge whether such features improve the experience of creating a course and whether the resulting course positively affects learning outcomes. I intend to focus on an introductory programming course: a subject matter I am familiar with, where learning outcomes are more easily assessed, and for which a vast amount of content is already available online to support the course.

MASTER’S PROJECT PROPOSAL

CHALLENGE

Needs

How might we scaffold “experts” to create engaging hybrid courses?

In 2009 I started a mobile app development school in Brazil targeting developers and designers who needed to acquire these hot new skills. For the first year or so I taught the iPhone app development course while looking for more teachers to meet the large demand and to create new courses. Pedagogically, I was going on instinct, using a very hands-on approach: explain the concept, model it, and do it yourself. It worked, and it was straightforward enough to explain to the new teachers.

The challenge came when I started hiring teachers for new courses. The curriculum had to be constructed and the course content created. This task proved daunting for developers who had never taught before. Even with my course material as a reference or model, teachers were slow to produce material, and it was usually of poor quality: slides with too many details or lacking explanations of key concepts.

Once I decided I wanted to start selling the courses online, the challenge became too big. Where do I start? How much video versus written material should I use? How will students ask questions? How will we manage all these students? What are the best practices? All questions that could be resolved by well-designed software that scaffolds the process of creating the curriculum and course content.

LEARNING

Benefits:

LXD will make the user more proficient in the art of sharing their knowledge, stimulating them to repeat the process and create new and better courses. Users will benefit from theory-grounded strategies that promote effective learning in online environments. The Virtual Student will lead the process by posing provocative questions and requesting content, assessment, and reflection activities to be inserted into the course progression. The backwards design methodology will be utilized along with best practices for teaching online.

The main learning outcome will be that online teaching requires a different set of approaches, content, media, interactive experiences, and assessment methods to be effective. The Virtual Student will serve as an instructor and coach for the user during the process – different from a standalone course on ‘how to teach online’. Teaching and learning will occur during the process of creating a course.

To demonstrate the effectiveness of LXD, I propose to survey users pre- and post-utilization of the tool with questions that will inform me of the following characteristics of the user:

Pre-utilization:

  • Digital literacy level
  • Previous knowledge/experience with pedagogy
  • Previous knowledge/experience with online teaching
  • Perception of online course effectiveness
  • Personal beliefs on the challenges of creating an online course
  • Confidence level for creating an online course

Post-utilization:

  • New pedagogical content acquired
  • New online teaching content acquired
  • Perception of online course effectiveness
  • Personal beliefs on the challenges of creating an online course
  • Confidence level for creating an online course

During the utilization of the tool I intend to collect the following data:

  • Webcam video recording
  • Screen video recording w/ mouse tracks and clicks
  • User will be asked to think-aloud throughout the process

I also intend to test LXD with users who have already created online courses and interview them to learn the following:

  • Perception of how much LXD actually helped them in the process
  • What they would do differently now if they were to redo their existing courses
  • Input and feedback on what worked, what didn’t work, and suggestions

The results will be interpreted using the grounded theory qualitative research method. Theories of how to improve the tool will emerge from coding the evidence and creating propositions. The conclusion will address issues such as the viability of the concept, its effectiveness, and suggestions for future improvement.

Approach:

The approach to learning that informs my design is a combination of the Protégé Effect, project-based learning, and TPACK. The expert, teacher, or content creator is here called the “user”, insofar as they are the person interacting with LXD during the course creation process.

The Protégé Effect will be elicited through a virtual student who prompts the user to teach it by asking leading questions, making suggestions, and warning the user about excessive use of one style of teaching as well as a lack of content, reflection opportunities, or detailing of prior knowledge. The virtual student closes the gap between content ideation and the actual student’s experience. Through immediate feedback, the virtual student will prompt the user to think deeply about content choices and aid in deciding the learning progression that must be in place.

The project-based learning approach simply entails that the user is engaged in a project while using LXD itself: the project is the course creation process, within which scaffolds are presented to the user. In addition, the approach utilizes backwards design principles embedded in the interactions the virtual student has with the user. The idea is that the heuristics and strategies invoked by the virtual student are guided by these approaches without necessarily making them explicit.

Finally, LXD aims at increasing the user’s Technological Pedagogical Content Knowledge (TPACK) by offering simple media editing tools, pedagogical scaffolds, as well as content produced by other users that can be incorporated into the course creation process. LXD in itself is a technological tool that will increase the user’s TPACK by presenting the necessary information, background knowledge, and content that supports the user’s ideation and publication strategies.

DESIGN OF THE LEARNING EXPERIENCE

Existing solutions (“competition”):

LXD is a construct that, for the purposes of this project, will build upon an existing LMS or CMS – let’s call it an LCMS for simplicity. This LCMS provides a base to start from. Creating an entirely new LCMS from scratch is unfeasible and not necessary in order to test LXD’s effectiveness. I am currently analyzing which platform would be best suited for this project. Here is a list of the ones I have shortlisted:

  • Moodle – open-source CMS
    • Positives
      • Total freedom to create
      • Mature platform with thousands of plugins
      • Large community to interact with
    • Negatives
      • Cumbersome to customize
      • Old looking base interface
      • Old HTML base – no use of HTML5 affordances
  • Coursera
    • Positives
      • Could work with existing content publishers on Stanford
    • Negatives
      • The focus is curating online courses done with any online tool (I think)
  • Udemy
    • Positives
      • Their content publishing tool is one of the most user-friendly I’ve seen
    • Negatives
      • Would have to negotiate with Udemy access to their platform’s source code
  • Udacity
    • Positive
      • Focused on tech courses – familiar to me
    • Negative
      • Have never seen their course publication tool
      • Would have to negotiate with Udacity access to their platform’s source code
  • EdX
    • Positive
      • Candace Thille might have contacts to get access to the company
    • Negative
      • Have never seen their course publication tool
      • Would have to negotiate with EdX access to their platform’s source code

At the moment, Udemy is looking like my favorite candidate.

I also intend to talk to VPTL at Stanford to understand the usual difficulties professors have in the process of creating their online courses.

Approach:

LXD will be a web-based tool that overlays the existing LCMS with text, images, and video, triggered by analyzing the steps and content being published in the course. This is where artificial intelligence comes into play. Let’s say the user has published a 30-minute video: LXD might suggest that the video should be shorter. If the user publishes 50 pages of text with no images, LXD might suggest that images illustrate concepts more powerfully than text alone. LXD might prompt the user to insert a knowledge check or reflection activity once the user has published five pieces of content. The idea is to provoke the user to think about how the learner will be processing the content towards learning.
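
Here is a sketch of how such rule-based suggestions could work; the thresholds, field names, and messages below are my own illustrative assumptions, not a committed design:

```python
# Minimal sketch of LXD-style suggestion heuristics over published course items.
# All thresholds and item fields here are illustrative assumptions.

def suggest(course_items):
    """Scan course items in order and return a list of suggestions."""
    suggestions = []
    items_since_activity = 0

    for i, item in enumerate(course_items, start=1):
        if item["type"] == "video" and item.get("minutes", 0) > 15:
            suggestions.append(f"Item {i}: consider splitting this video into shorter segments.")
        if item["type"] == "text" and item.get("pages", 0) > 20 and item.get("images", 0) == 0:
            suggestions.append(f"Item {i}: consider adding images; they can illustrate concepts more powerfully than text alone.")

        if item["type"] in ("video", "text"):
            items_since_activity += 1
        else:                          # a quiz, reflection, etc. resets the counter
            items_since_activity = 0

        if items_since_activity >= 5:
            suggestions.append(f"Item {i}: consider inserting a knowledge check or reflection activity here.")
            items_since_activity = 0

    return suggestions

# Example: a 30-minute video and a long, image-free text section both trigger hints.
course = [{"type": "video", "minutes": 30},
          {"type": "text", "pages": 50, "images": 0}]
print("\n".join(suggest(course)))
```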

The key features of LXD are:

  • Virtual Student
    • 3D character that talks to the user
    • Guides the user through the process of creating the content
    • Asks questions about the content and format of the course as it is created
  • Media Editing Suite
    • Video editor
    • Image editor
    • Text editor
  • Course Publication Tool
    • Create course structure and progression
    • Add media
    • Create assessments (quizzes, multiple choice, reflections, etc.)
  • Curated Content
    • Access to similar courses to get examples
    • Ability to link to external material for students’ reference
  • Coaching
    • Peer-to-peer help to go through courses

This is definitely ambitious for the time and resources I have for this project. The biggest challenge, other than the sheer volume of features, is the Virtual Student. I would have to partner up with someone who has experience and access to such technology in order to create a prototype.

EVIDENCE OF SUCCESS

Benefits: How will you show that what you’re doing is 1) working (usability studies) and 2) helping learners learn (learning assessment)?  What kinds of evidence do you plan to collect (summarize here; put survey items, think-aloud protocol, transfer problems, etc. in an appendix)? Will you use questions/frameworks from published studies, or create your own? What target learners will you interview/observe, where and under what circumstances? How do you expect to analyze and interpret your findings?

SUMMARY AND NEXT STEPS

What were the major ideas  in this project? What are you most looking forward to?

REFERENCES

(length as needed)

This proposal should use existing scholarship to justify, explain, and extend what you propose to do. In addition to being mentioned in the text, this list of sources cited in the proposal provides instructions for how to find the original reference, in case the reader wants to know more about what that author said. Please use one standard format, such as APA: http://owl.english.purdue.edu/owl/resource/560/02/.

 

APPENDIX A: TIME

(1 page max.)

 

Milestones and deliverables

When do you need to do what, in order to finish on time? Example:

Winter quarter – Observe target learners

Develop ideas

Write proposal

March 20, 2015 – Proposal draft submitted to advisor
Date – Participants for user testing and learning assessment arranged
Date – Low-res learning assessments complete
Date – Low-res prototype studies complete
Date – Round 2 learning assessments complete
Date – Round 2 prototype design complete
Date – Final user testing and learning assessment complete
July 20, 2015 – Project logo and video submitted
July 31, 2015 – EXPO presentation, demo
August 6, 2015 – Draft report submitted
August 13, 2015 – Signed Master’s Project Form submitted

Time needed to implement project

Include your time and time of others. Okay to barter and trade skills with each other. You can learn new skills, but include the time it takes.

 

APPENDIX B: MONEY

(1 page max.)

 

Funds needed to implement project

If you had a small budget to spend, what would you want to use it for? Think about thank-you gifts for testers, consultants, software, supplies. Be creative in thinking about how to leverage limited funds.

 

Every project that submits a budget here will be given a stipend of up to $200 per student to cover those needs. Budgets should be specific enough to show where the funds go, but need not itemize every expense. More funds may be available (apply and wait for approval before spending the funds!).

 

Item Approximate Cost
$
Total:

 

APPENDIX C: PEOPLE

(1 page max.)

 

Collaboration (For Team Projects)

Explanation of how you intend to work with others on the team. How will the work and responsibilities be shared? How will individuals’ contributions be incorporated into the group product?

 

Supporters

List of cooperating contributors to your project. These are people outside the project team whom you expect to consult with or who will provide support for the project. What tasks will you need help with (e.g., coding, graphics, connecting with target learners)?  In a real-world proposal they would write letters of support. Here, just list names & their contributions.

 

For individual projects, how will you set up your environment to give you the feedback and support you need along the way?

 

APPENDIX D: OTHER (Optional)

(length as needed)

 

Anything else you think you need to share with your reader. Do not assume appendices will be read; these are reference materials that provide the opportunity for the reader to go deeper should she or he so desire. Summarize the message or insights gleaned from these materials in the text of your proposal.  Example: annotated list of competing products.