Motivation and Screencasts

May 22, 2014 • Greg Wilson

Our next exercise has two parts.

Part 1: Motivational Videos

I would like you to create a three-minute screencast to motivate people to want to learn a topic of your choice. Our goals are:

  1. To learn how to do video teaching, which is different from teaching live or making written materials.
  2. To get more practice giving feedback.
  3. To think about how to get people excited about what you’re about to teach.

To begin, think of something you’d like to teach someone in your field—some tool or process that would take between half an hour and an hour to get across.

Second, you’ll need some software. If you’re using Windows or Mac OS X, I suggest you download a trial version of Camtasia; it’ll take you about an hour to install it and learn how to drive it. (If you don’t want to do that, tools like QuickTime actually have a screen recording mode.) Tooling is a bit more complex on Linux: people have had good luck with Screencast-o-matic, but there’s some buzz now about SimpleScreenRecorder and Freeseer as well.

Third, do a couple of practice runs, but no more than a couple—in the past, some people have spent several hours trying to make their video perfect, but that’s not what we’re after:

  1. We want to see and hear what you’d actually do and say in front of an audience if you were teaching live.
  2. In real life, you’ll never have four hours to make a three-minute video anyway (not unless you have a Hollywood budget).

Once you’ve recorded your video, post it on YouTube, Vimeo, or some other sharing site, and then blog a short description of what you’ve taught and a link to the video. Please try to get this up by Wednesday, May 28, so that everyone has a few days to watch and comment on at least two videos before our meeting on June 4.

If you’ve never created a screencast before (or even if you have), you may enjoy reading this guide. Links to screencasts created by previous instructor trainees can be found here, and as always, questions are very welcome.

Part 2: Demotivation and How to Avoid It

In the second part of this exercise, I want you to write a paragraph describing a specific episode in which something or someone demotivated you when you were learning, and another paragraph in which you explain what specific things could have been done before, during, or after that incident to prevent or fix things. As I said in our meeting yesterday, taking away demotivators is as important to teaching as doing things that are motivating: if we can avoid turning learners off, we’re a lot further ahead than we were.

These two paragraphs should be added to your video blog post before our June 4 meeting, but don’t have to be up next week. As always, if you have questions, please mail me directly or ask on the mailing list.


Group 9 / May 21, 2014

Notes

  • Trick: include open-ended questions to fish for plausible wrong answers
    • If you get one misconception a lot — you can correct it in class
    • If you get as many as there are students, it’s a more complex subject than you thought
  • Jon Duncan: Timothy — great point. Going from abstract concept maps to specific questions was the challenge here for me
  • Scott Burns: the most difficult part for me was making questions that *only* covered the concept map
  • Class is split on which was more useful — concept maps or assessment questions
  • Use the one you find more useful, and if you’re cooperating with someone who already did one — do the other. It’s more likely to uncover new perspectives.
  • Students learn in different patterns, and MCQs are more attentive to the students’ needs. Concept maps are better at helping the instructor — HT hsingtzu
  • Greg likes using concept maps when working face to face (i.e. can draw on a whiteboard). Assessment questions/MCQs, and the specific text associated with them, work better when working remotely, e.g. via version control.
  • Scaffolding — many people, when given a blank sheet of paper, are afraid. Giving them a “fill in the blanks” skeleton solves that, and also makes their answers quicker to check (see the sketch after these notes).
  • When there is a case to be made for each option in an MCQ, you can ask the student to rank the options.
  • If everyone answers correctly before you start — just skip to the next lecture. If everyone stares blankly, teach. What if it’s half and half?
    • You can say “everyone who got it right, grab the closest person who doesn’t and explain to them”
  • Greg:
    • Assessment questions can actually be designed to teach a student something new by challenging them to apply the learned skill in a new context.
  • Alex:
    • I feel that experts would laugh at the simplicity of some of my questions
      • Greg: questions can also be used to assess knowledge prior to teaching a lesson. If all your students get the correct answers and understand the topic, you may be able to skip the lesson and move on to more advanced topics.
      • Peer-to-peer teaching, while hard to make a lot of time for in a 2-day workshop, is valuable because students get direct face-to-face feedback. Making time for students to try P2P teaching also makes them more likely to use it in their own practice.
      • Be careful that students who do understand the topic don’t feel that you are taking advantage of them when you ask students to teach (and learn from) each other.
  • Greg
    • How do you know if you have a good assessment Q?
      • You don’t! It’s common to include some questions on a test but not grade them, simply to see what the spread of answers is and to gather possible MCQ answers for the future.
      • If the spread is too wide, the question is likely too ambiguous
      • To get the full effect, it’s essential that students don’t know which Qs are being graded.
    • Need to check to see how many topics each question is testing
      • If a Q tests more than one, it may be more difficult to answer correctly.
  • Jacob
    • Found out through testing my Q that a feature I thought was included (tab completion) is not actually installed by default!
      • Question would have fallen flat in the real world.
      • I wanted to use tab completion to apply consistent formatting to the programming responses so that they could be more quickly assessed.
  • Greg
    • In programming lessons, because there is often more than one correct way of answering/solving the problem, it may be better to provide most of the code and ask students to fill in the blanks for the correct solution.
  • MCQs that ask students to order answers rather than pick one are a good way to assess judgement, and are easy to grade
    • may not work with all questions, but is another tool for designing good questions
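
A minimal sketch of the “fill in the blanks” scaffolding idea in Python (the exercise, its function name, and the marked lines are my own invention for illustration; in a handout, the marked lines would be replaced with blanks):

    # Completed version of a skeleton exercise; in the handout, the two
    # marked lines would be blanked out for learners to fill in.
    def count_vowels(text):
        """Return the number of vowels in text."""
        total = 0
        for char in text.lower():
            if char in "aeiou":  # blank 1: learners write this membership test
                total += 1       # blank 2: learners update the count
        return total

    # Because the surrounding structure is fixed, every learner's answer
    # can be checked the same way:
    assert count_vowels("Software Carpentry") == 5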

Questions for today:

  • How long did it take you to create?
  • Looking at responses/reviews to your questions, how well phrased (unambiguous) was your question?
  • How accurately do you think it would assess learners’ knowledge *before* a lesson? (Think about the answers other people gave you…)
  • How accurately do you think it would assess their knowledge *after* the lesson? (If the answer is different than the one above, why?)
  • Do you think you have a clearer idea about how to teach your topic now than you did before you wrote your question?  If so, how did it help?

19:00 Eastern

  • Greg Wilson (Mozilla, Toronto): no assessment URL
  • Devasena Inupakutika (University of Southampton, SSI, UK): http://teaching.software-carpentry.org/2014/05/19/assessment-for-while-loop-concept-map/
    • Around 15 minutes.
    • I think the concept map I chose was simple and straightforward, so it was OK. However, I was not sure which language to consider while framing questions, as I’m not sure which language or technology students will be comfortable with.
    • It was OK, I believe. But prior knowledge of “while” (in this case, in any language) is required.
    • Pretty good. The concept map would help learners relate the actual meaning of “while” in the English language (i.e. if they don’t have a prior programming concept of “while”) to the one in the map.
    • Yes, I now have a clear idea of how to teach a topic. A concept map is a good approach to start with and helped me in explaining a topic.
  • Jeremiah Lant (Louisville, Kentucky, U.S.): http://teaching.software-carpentry.org/2014/05/14/assessment-questions-for-regular-expressions/
    • The assessment questions took around 20 minutes to create.
    • The assessment questions were phrased fairly well. Maybe not as clear as I would have liked.
    • The assessment questions would assess one’s knowledge pretty well before the lesson because I tried to stress the purpose of using regular expressions.
    • The assessment questions would assess one’s knowledge better after the lesson because I tried to stress how regular expressions could be applied and their purpose.
    • Yes, I do have a slightly clearer idea of how to teach the topic better now than before.  It helped me learn how to be more specific in asking the question and address the main idea I want to get across.
  • Dan Warren (Macquarie University, Sydney, Australia): http://teaching.software-carpentry.org/2014/05/15/assessment-for-regular-expressions/
    • About ten minutes
    • The second one was evidently not as clear as I’d intended
    • Decently well, I think.  If you had no exposure to regex before, I think the questions would have been completely baffling.
    • About the same, but hopefully the lesson would be good enough that you could answer them.
    • Yes, it made it very clear how specific you have to be.  There were clearly different ways to read my second question.  While I would have graded several different answers as “correct”, I could make it easier on myself by setting up the question so that there are fewer correct ways to answer it.
  • Jonathan Frederic (CalPoly San Luis Obispo, California, U.S.): http://teaching.software-carpentry.org/2014/05/21/assessment-for-list-comprehensions/
    • 15 minutes.  It took a while to consider the possible responses that the students would give and whether or not they would be useful.  Additionally, it was hard to write a Q&A type assessment without giving away answers to previous questions.
    • Because my assignment was late, I have not received any responses yet.
    • My only fear is that I may have underestimated their knowledge (actual students that is, I’m sure what I wrote was overly simplistic for most/everyone here).  It may be easy enough to answer without actually learning anything from my talk.
    • Once again, my questions may be too simple.  I’m not sure that if the students answer them correctly I can be sure it proves anything more than that they were paying attention.
    • Yes/no.  I think the concept map is exactly how I would start to explain my subject.  Questions in my Q&A would guide what subjects need emphasis.
  • Tim McNamara (NeSI, New Zealand) http://teaching.software-carpentry.org/?p=7272
    • Lots of thinking (well-meaning procrastination) over the weekend, but drafting my questions took about 10-15 mins once I finally got down to it. Very challenging to think of “good enough” answers.
    • I wonder if one of my questions assessed syntax, rather than understanding. Also slightly concerned that with my other one, knowledge of the definitions (mutability) was being assessed rather than the concepts.
    • Pretty well. The syntactical issues would be less of a concern, as the exposure would have happened. Definitions would have also been explored.
    • Yes. Although I was, in a sense, assessing against the concept map presented to me — rather than how I may have introduced the topic.
  • Matthew Dimmock (Monash Uni, Australia) http://teaching.software-carpentry.org/2014/05/14/formative-questions-on-concept-map-by-tim-mcnamara/
    • 20 mins or so.
    • The MCQ was probably a little too easy and didn’t challenge the user enough.  The technical question was actually on Stack Overflow, which I should have checked first.
    • I think the questions would have assessed knowledge both before and after.
    • I think I now know how to ask an MCQ in a more precise manner.
  • Yu-Ching Shih (National Taiwan University, Taipei, Taiwan) http://teaching.software-carpentry.org/2014/05/20/assessment-question-for-vim
    • 20 min.
    • My MCQ might not be clear, since two answers were plausible but only one could be chosen.
    • They would work pretty well.
    • YES. The concept map helps a lot with how to teach a topic. Assessment questions let me know where to put emphasis.

14:00 Eastern

  • Greg Wilson (Mozilla, Toronto): no assessment URL
  • Dav Clark (UC Berkeley) http://teaching.software-carpentry.org/2014/05/20/formative-questions-for-github-actually-generally-about-git/
    • 15 minutes
    • I’m very happy with how it worked out — I think it was unambiguous and revealed gaps in knowledge
    • Well
    • Less well — if someone knows a few terms after I just mentioned them, they are more likely to get the second question correct by rote (and perhaps the first)
    • Yes — I am a bit more confident in assessing student knowledge
  • Padraic Stack (Dublin, Ireland): http://teaching.software-carpentry.org/2014/05/13/assessment-for-devasena-inupakutikss-shell-paths/
    • Q1. About ten to fifteen minutes (I think).
    • Q2.  I didn’t get any answers — I think my questions would have seemed quite basic to many of this course’s participants. That said, I think the questions were clear and unambiguous.
    • Q3. I think my questions would let me know if someone was an absolute beginner or not — they wouldn’t serve to ‘grade’ people’s levels of knowledge.
    • Q4. I think it would work pretty well for checking whether people took in the lesson content — which would be pitched at a beginner.
    • Q5. Yes and yes — actually, what was helpful was seeing someone else’s questions for the same lesson / concept map.
  • Alex Simperler (Imperial College London, UK): http://teaching.software-carpentry.org/2014/05/09/assessment-to-graham-etheringtons-while-concept-map/
    • Q1 10 minutes — needed to find old slides about Perl
    • Q2 was good
    • Q3 I would have recognized them as programmers
    • Q4 would have made experts laugh
    • Q5 I usually teach beginners so would be fine — I actually learned some new ways of doing stuff
  • Bror Jonsson (Princeton University, currently Cape Town): http://teaching.software-carpentry.org/2014/05/14/assessment-question-for-padraic-stacks-setting-up-git/
    • About 20 mins?
    • Seemed OK.
    • Assessing before class can be problematic because the students can feel stressed
    • Much better — the students have a reasonable chance to know the answers.
    • I do. I normally ask questions when teaching orally, but written questions could give an extra level of information.
  • Michael Schliephake (KTH, Sweden): http://teaching.software-carpentry.org/2014/05/07/mcq-about-shell-pathes/
    • 30 mins
    • could be improved based on the hints received
    • The questions test knowledge about the topic, so they should be suited for diagnosis
    • Same as above.
    • The formulation of the questions helped to sharpen the use of terms
  • Russell Alleen-Willems (Diachronic Design, Seattle, WA) — A bit noisy here, but I can still chat so long as my baby stays asleep!: http://teaching.software-carpentry.org/2014/05/14/assessment-questions-for-basic-loop-structures/
    • Not long, surprisingly (~30 mins for 1 MCQ, 1 short fill-in-the-blank Q). Once I had the concept map, and the end goal of a set of tasks I wanted students to be able to accomplish, coming up with brief MCQs or exercises to test those skills didn’t take very long.
    • The three “students” who answered my questions thought they were fairly clear, and all three answered correctly.
    • I think my questions would do a good job of testing students’ specific knowledge immediately before teaching a skill, and then assessing how well I taught the skill immediately afterwards or on a later test.
    • I think it would assess how well I taught the lesson for that specific situation. In How Learning Works, I know the point was made that sometimes students have trouble using learned skills in unfamiliar situations/contexts.
    • Yes! Formulating specific assessment questions and end goals constrained the scope of the topics I feel I would need to cover in a lesson.
  • Chandler Wilkerson (Rice U, Texas) http://teaching.software-carpentry.org/2014/05/19/assessment-for-sql-statements-by-jacob-levernier/
    • Choosing a concept map took longer than creating the assessment, ~15 min
    • I believe my second question was ambiguous, and I hadn’t seen that when I wrote it.
    • As a pre-assessment, it would be reasonable, probably matching the scope of the 5-minute concept explanation I imagined from the map
    • It wouldn’t be a complete assessment for the whole concept, but I didn’t want to throw in lots of extra syntax that wasn’t in the map.
    • I believe it helped me gain a better understanding of what a very basic intro to SQL would look like.
  • Catalina Anghel (OICR, Canada) http://teaching.software-carpentry.org/2014/05/21/assessment-for-functions-in-r/, but it’s not very good, so don’t call on me!
    • Over an hour (got distracted with a question that I thought would take 5 min and didn’t)
    • I think it was too long, as people didn’t attempt it (although I was late posting it, too)
    • Not very?
    • Yes, but just testing basic syntax.
    • Yes, at least to know that something that I thought was 5 minutes wasn’t.
  • Stefan Pfenninger (Imperial College London, UK) http://teaching.software-carpentry.org/2014/05/14/assessment-questions-for-list-comprehension/
    • Don’t remember exactly, 15-20min probably
    • Unambiguous, all answers correct
    • You would need to understand the topic before being able to answer
    • See above
    • Have to be careful to design questions that don’t inadvertently end up assessing additional concepts which the learners might not have come across yet (otherwise you end up not actually knowing which of the concepts they didn’t understand if they get it wrong)
  • Jacob Levernier (U. Oregon, USA) http://teaching.software-carpentry.org/2014/05/15/assessment-for-python-dictionaries-2/, http://teaching.software-carpentry.org/2014/05/15/assessment-on-shell-paths/
    • I wrote two sets of questions. Each took approx. 15 min., but much of that time was in testing the answers to make sure that I had correctly written down what should be correct (and that none of the other MCQ answers were accidentally correct).
    • The Python assessment had an unintentionally very ambiguous question, because a feature that I thought was the default on all systems turns out not to be. The other questions seemed to be pretty unambiguous (with the shell navigation caveat below), although I did get conflicting (and thus helpful) feedback about whether assessment questions should also teach something in the process of answering them.
    • I did receive feedback that the shell navigation questions might create false negatives (i.e., might be confusing). As the commenter noted, this is probably better than creating false positives, but it’s still definitely a potential problem.
    • I actually learned a few additional things about the topic itself while writing the questions, so I learned more about the topic in addition to learning more about how to assess the topics.
      • One issue I did find myself having to think a lot about was creating assessments that could be “graded” quickly — normalizing output, for example (see the sketch after this list).
  • Genevieve Smith (UT Austin) http://teaching.software-carpentry.org/2014/05/13/multiple-choice-questions-about-lists-in-python/
    • Approximately half an hour for two multiple choice questions
    • I think mine were comparable, if perhaps a bit easy
    • I think the questions would be confusing before the class
    • I think they’d be relatively good assessments after the class, although I made some assumptions about exactly what topics would be covered
    • I don’t think I really revised my thinking on how to approach the topic — the questions are aimed at reinforcing practice, rather than being more conceptual questions.
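
A minimal sketch of the output-normalizing idea from Jacob’s notes above (the function name and the specific normalization rules are my own assumptions for illustration):

    def normalize(output):
        # Strip leading/trailing whitespace, collapse internal runs of
        # whitespace, and lowercase, so that superficially different but
        # equivalent answers compare equal and can be graded quickly.
        return " ".join(output.split()).lower()

    # Two learner outputs that differ only in spacing and case:
    assert normalize("Hello,   World\n") == normalize("hello, world")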

10:00 Eastern

  • Greg Wilson (Mozilla, Toronto): no assessment URL
  • Jon Duncan (University of North Carolina) http://teaching.software-carpentry.org/2014/05/13/assessment-questions-for-concept-map-on-basic-syntax-of-an-r-function/
    • It took about 20 minutes, maybe 30 start to finish
    • It looks like people understood the questions and had a couple of nice points that I had missed
    • For formative assessment — I think it would have to be used with a group of students that have had at least some programming experience
    • It’s a good start to assessing knowledge — at least basic understanding.
    • Yes, I definitely have a clearer idea of how to teach the topic
  • Christian Jacobs (Imperial College London, UK): http://teaching.software-carpentry.org/2014/05/08/assessment-python-dictionaries/
    • It took me about an hour to create my two MCQs.
    • I think the questions were reasonably clear based on the reviews.
    • I think the terminology used in the MCQs might lose a lot of students, but I think it could assess knowledge (or lack of knowledge) before a lesson reasonably well.
    • I think MCQs would better assess knowledge after the class, but it would only test a very small part of the topic.
    • Yes, I think so.
  • Timothy Warren (University of Washington, Seattle) http://teaching.software-carpentry.org/2014/05/17/assessment-for-python-dictionaries-3/
    • About an hour.
    • Helpful feedback pointed out that my directions for the 2nd question (not multiple choice) were unclear.
    • I think my question would provide useful assessment prior to teaching.
    • Ditto for assessment after teaching.
    • Yes — questions useful for design of teaching.
  • Mark Wilber (University of California, Santa Barbara): http://teaching.software-carpentry.org/2014/05/15/assessment-for-python-functions/
    • It took me about 30 min to create my question
    • I thought my question was pretty clear based on the responses
    • It would do a decent job of assessing the learner’s knowledge before.  I don’t think they could answer it without the lesson
    • Same as above.
    • I do.  Thinking about specific situations in which you would apply a function helped me draw analogies to “real-life” examples that I think the students would be able to relate to.
  • Dan MacLean (Sainsbury Lab, Norwich, UK): http://teaching.software-carpentry.org/2014/05/12/assessment-for-concept-match-for-regular-expressions/
    • about 10 minutes
    • I think it was clear. No responses though…
    • Badly, actually — either they’d have no clue or they’d know it; not much gradation
    • hopefully well. They have a real result and need to evaluate it against the concept map
    • I think so.
  • Aur Saraf (Tel Aviv): http://teaching.software-carpentry.org/2014/05/15/assessment-for-python-dicts-aur-saraf/
    • Maybe half an hour
    • I had one person answer them, and he understood them very well; however, he seems to be much above the level of my students :-)
    • A person familiar with the material would easily answer both questions correctly
    • I hope I’d be able to teach in five minutes enough compressed material to answer the questions. There is some risk that they’re too advanced for five minutes.
    • I’ve taught it enough times already that I already had a good idea
  • Hsingtzu Wu (Japan) http://teaching.software-carpentry.org/2014/05/11/2-mcqs-for-isabels-concept-map-about-loops/
    • a few min
    • seems ambiguous
    • No, maybe not accurate, because this assessment was designed for the specific material. This does not test general knowledge of programming
    • Maybe just OK, because this assessment is based on the concept map.
    • Maybe. I know what answers I would like from the students.
  • Scott Burns (Vanderbilt University, Nashville) http://teaching.software-carpentry.org/2014/05/14/assessment-questions-for-list-comprehension-concept-map/
    • 20 minutes
    • Could have used simpler language in the answers to diminish ambiguity
    • I wrote the questions very much against the concept map, so vocabulary introduced by it might not be known beforehand
    • I think it would assess the specifics of the topic.
    • Yes, really understanding what the concept map teaches (and doesn’t!) helps limit the scope of the questions, and vice versa for the lesson.
  • Jeff Hollister (US EPA, Rhode Island) http://teaching.software-carpentry.org/2014/05/15/assessment-for-data-structures-in-r/
    • Took me about 10-15 minutes
    • The one answer I got was correct! So clear enough, I suppose.
    • For what I tested, I think it did a good job.  Perhaps a bit too specific, though, as it only deals with a very small part of data structures in R.
    • Same as above.  I think it would test it well, but tests only a small part of the topic.
    • Having the questions first would certainly help direct what I wanted to cover. So, although I said above that it was maybe too specific, I am going to contradict myself now.  The specificity in this case provides a clearer direction for what to cover.
  • Florian Rathgeber (Imperial College London) http://teaching.software-carpentry.org/2014/05/14/assessment-for-python-dictionaries/
    • 1) took me about 1h (including deciding which assessment to find MCQs for)
    • 2) The right answer for Q1 could have been a bit more “beginner friendly”. Q2 was maybe a bit too advanced; I should have at least given concrete lists to work with
    • 3) novices probably wouldn’t be able to answer, so it could identify advanced learners
    • 4) people should be able to answer Q1, for Q2 they should at least find 1 way
    • 5) yes, although I was trying to find MCQs different from another person doing the same assessment, so I might have chosen differently, had I been the only one doing the assessment
  • Graham Etherington, Sainsbury Lab, Norwich http://teaching.software-carpentry.org/2014/05/13/assessment-for-victoria-offords-grep-pattern-in-array-perl/
    • Took me about half an hour to think about and write.
    • One of my questions was a bit ambiguous, as a number of options would have worked, but I just asked for the simplest solution.
    • I think it pretty well assessed how much knowledge a student had on the subject.
    • I’d probably be a bit more specific in my questions.
  • Huayan Gao, CUHK http://teaching.software-carpentry.org/2014/05/21/assessment-for-sql-select-statements/
    • Took me about 20 minutes
  • Mark Stillwell (Cranfield University, UK): http://teaching.software-carpentry.org/2014/05/14/assessment-for-learning-github/
    • About 20 minutes
    • I think my question was fairly unambiguous
    • The long form question would be good for pre-class assessment. The MCQ would not be very informative for this.
    • The MCQ would show how well the students were paying attention to the lesson, while the long form question would show deeper knowledge of the topic.
    • I do think I have a better idea once the method of assessment has been sorted out. I’ve been learning a lot about constructive alignment lately…