See Atul Gawande’s 2007 article “The Checklist” for a look at how using checklists can save lives (and make many other things better too).
Scheduling the Event [Coordinator]
- Decide if it will be in person, online for one site, or online for several.
- Talk through expectations with the host(s).
- If it is in person, make sure the host knows they’re covering travel costs for trainers.
- Determine who is allowed to attend.
- We strongly prefer trainees to have attended workshops (as learners or helpers).
- Other criteria may be negotiated by the Executive Directors as part of partnership agreements.
- Arrange trainers.
- Arrange space.
- Make sure there are breakout rooms for video recording.
- Choose dates.
- If it is in person, book travel.
- Get names and email addresses of attendees from host(s).
- Register those people in AMY.
- Email attendees a welcome message that includes:
- a link to the workshop home page
- background readings
- a description of any pre-requisite tasks
- Make sure attendees will all have network access.
Setting Up [Trainer]
- Create an Etherpad (http://pad.software-carpentry.org/).
- Set up a one-page website for the workshop using https://github.com/swcarpentry/training-template as a starting point.
- Send the URL to the admins.
- If it is online:
- Test the video conference link.
- Set up a meeting with the hosts to make sure the BlueJeans channel works and to give you a chance to meet “face-to-face”.
- Check whether any attendees have special needs.
During the Event [Trainer]
- Introduce yourself (see detailed guide below).
- Ask your trainees to introduce themselves to each other.
- Remind everyone of the code of conduct.
- Collect attendance.
- Distribute sticky notes.
- Use the etherpad.
- Collect participants’ GitHub IDs (if they are interested in teaching Software Carpentry).
- Go through the checkout procedure point by point.
- Explain how we format lesson submissions.
After the Event [Trainer]
Between Instructor Training Sessions [Trainer]
- Sign up to lead teaching demonstrations.
- Email a list of trainees who participated in the teaching demo to email@example.com, noting whether each passed or failed.
After Trainees Complete [Head of Instructor Training]
Note that trainers do not examine their own trainees: having them examine each other’s trainees helps balance the load and maintain consistency of curriculum and standards.
You may use the following message templates to communicate with trainees:
To begin your class, give a brief introduction that conveys your capacity to teach the material, your accessibility and approachability, your desire for student success, and your enthusiasm. Tailor your introduction to the students’ skill level so that you convey competence (without seeming too advanced) and demonstrate that you can relate to the students. Throughout the workshop, continually demonstrate that you are interested in student progress and enthusiastic about the topics.
Students should also introduce themselves (preferably verbally). At the very least, everyone should add their name to the Etherpad, but it’s also good for everyone at a given site to know who else is in the group. Note: this can be done while setting up before the start of the class.
Have students write answers to the initial MCQ in the Etherpad, or import it into Socrative using this ID: SOC-25251122. Briefly summarize the answers.
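If answers are collected as plain text in the Etherpad, a quick tally makes the summary easier. This is just an illustrative sketch (the answer data is hypothetical) using Python’s `collections.Counter`:

```python
from collections import Counter

# Hypothetical answers pasted from the Etherpad, one per learner.
answers = ["B", "B", "A", "C", "B", "A", "D", "B"]

tally = Counter(answers)

# Print the distribution from most to least common,
# which is usually enough to summarize aloud.
for choice, count in tally.most_common():
    print(f"{choice}: {count}")
```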
Learners do think-pair-share for cognitive maps and multiple-choice questions.
In the two-day versions, have learners read the operations guide as their overnight homework and do their demotivational story just before lunch on day 2: it means day 2 starts with their questions (which wakes them up), and the demotivational story is a good lead-in to lunchtime discussion.
Don’t have them complete the Teaching Perspectives Inventory or read through the pre- or post-assessment questionnaires in class: it kills momentum.
If there are people among the trainees who don’t program at all, make sure they are in separate groups, and ask those groups to work with that person as a learner to help identify the different loads involved. Another option is to use a faded example that is not programming-specific, but that may be difficult to achieve.
One of the key elements of this training course is recording trainees and having them, and their peers, critique those recordings. We were introduced to this practice by UBC’s Warren Code, and it has evolved to the following:
On day 1, show trainees a short clip (3-4 minutes) of someone teaching a lesson and have them give feedback as a group. This feedback is organized on two axes: positive versus negative, and content versus presentation. The first axis is explained as “things to be repeated and emphasized” versus “things to be improved”, while the second is explained by contrasting people who have good ideas, but can’t communicate them (all content, no presentation) with people who speak well, but don’t actually have anything to say.
Trainees are then asked to work in groups of three. Each person rotates through the roles of instructor, audience, and videographer. As the instructor, they have two minutes to explain one key idea from their research (or other work) as if they were talking to a class of interested high school students. The person pretending to be the audience is there to be attentive, while the videographer records the session using a cellphone or similar device.
After everyone has taught, the trio sits together and watches all three videos in succession, writing out feedback on the same 2x2 grid introduced above. Once all the videos have been reviewed, the group rejoins the class; each person puts all the feedback on themselves into the Etherpad.
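The rotation through roles and the 2x2 feedback grid can be sketched in code. This is purely illustrative: the names and sample comments are made up, not part of the actual exercise materials.

```python
ROLES = ["instructor", "audience", "videographer"]

def rotation_schedule(trio):
    """One round per person: rotate each member of the trio
    through all three roles exactly once."""
    rounds = []
    for shift in range(len(trio)):
        rounds.append({role: trio[(i + shift) % len(trio)]
                       for i, role in enumerate(ROLES)})
    return rounds

# The 2x2 feedback grid introduced on day 1:
# positive vs. negative, content vs. presentation.
feedback = {
    ("positive", "content"): ["clear motivating example"],
    ("positive", "presentation"): ["good eye contact"],
    ("negative", "content"): ["skipped a key definition"],
    ("negative", "presentation"): ["spoke too quickly"],
}

for round_ in rotation_schedule(["Ana", "Ben", "Carla"]):
    print(round_)
```

Walking through the schedule makes it easy to check that everyone teaches once, listens once, and records once before any reviewing starts.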
In order for this exercise to work well:
Groups must be physically separated to reduce audio cross-talk between their recordings. In practice, this means 2-3 groups in a normal-sized classroom, with the rest using nearby breakout spaces, coffee lounges, offices, or (on one occasion) a janitor’s storage closet.
Do all three recordings before reviewing any of them, because otherwise the person to go last is short-changed on time.
People must give feedback on themselves, as well as giving feedback on each other, so that they can calibrate their impressions of their own teaching according to the impressions of other people. (We find that most people are harder on themselves than others are, and it’s important for them to realize this.)
At the end of day 1, ask trainees to review the lesson episode you will use for the live coding demonstration at the start of day 2.
Try to make at least one mistake during the demonstration of live coding so that trainees can see you talk through diagnosis and recovery, and draw attention afterward to the fact that you did this.
The announcement of this exercise is often greeted with groans and apprehension, since few people enjoy seeing or hearing themselves. However, it is consistently rated as one of the most valuable parts of the class, and also serves as an ice breaker: we want pairs of instructors at actual workshops to give one another feedback, and that’s much easier to do once they’ve had some practice and have a rubric to follow.
Note on closed captioning: sometimes audio in the room can be poor. Two suggestions for improving accessibility are to have students watch the video on their own laptops and to turn on closed captioning by pressing the “CC” button at the bottom of the video.
Part 1: how not to do it
Part 2: how to do it right
In the exercise on brainstorming motivational challenges, review the comments in the Etherpad. Rather than reading them all out loud, highlight the common themes (e.g., establishing value, setting positive expectations, promoting self-efficacy) or things that stand out or that you can relate to. Note: this exercise can be done before or after going through the above list.
In the exercise on brainstorming demotivational experiences, review the comments in the Etherpad. Rather than reading them all out loud, highlight a few things that could have been done differently. This will give everyone some confidence in handling these situations in the future.
In 2014, George Monbiot wrote:
If we had set out to alienate and antagonize the people we’ve been trying to reach, we could scarcely have done it better. This is how I feel, looking back on the past few decades of environmental campaigning, including my own…
Experimental work suggests that when fears are whipped up, they trigger an instinctive survival response. You suppress your concern for other people and focus on your own interests… Terrify the living daylights out of people, and they will protect themselves at the expense of others…
A lot of advocates for open science and reproducible research make the same mistake. They frighten people with talk of papers that have been retracted when they should talk about all the new science people could do if they weren’t wasting hours trying to figure out how they created figure number three in the first place.
We have found that we have more impact when we emphasize how much more researchers can do when they are computationally competent. We have also found it important to emphasize that what we teach, and how we teach it, is based on the best available evidence. We use live coding instead of slides because research shows that people learn more from doing than watching. Similarly, the tools we teach are ones that our instructors—who are active researchers themselves—use daily.
One final point to make in instructor training workshops is that our greatest impact may be what we teach our instructors about teaching and collaborating. As a species, we know as much about education as we do about public health, but since most university lecturers are self-taught teachers, they are completely unaware of this body of knowledge. At the same time, the massive, open collaboration that has made Wikipedia and open source software successful has never taken hold in teaching. Most university lecturers are still the sole creators and consumers of their lessons, which wastes time and impedes the spread of good ideas. Changing that could have more impact in the long run than anything to do with for loops and pull requests.
Discussion of the practical implications of learning concepts brings us to our next big idea: people learn best when they care about the topic and believe they can master it. Neither fact is particularly surprising, but their practical implications have a lot of impact on what we teach, and the order in which we teach it.
First, most scientists don’t actually want to program. They want to do scientific research, and programming is just a tax they have to pay along the way. They don’t care how hash tables work, or even that hash tables exist; they just want to know how to process data faster. We therefore have to make sure that everything we teach is useful right away, and conversely that we don’t teach anything just because it’s “fundamental”.
Second, believing that something will be hard to learn is a self-fulfilling prophecy. This is why it’s important not to say that something is easy: if someone who has been told that tries it, and it doesn’t work, they are more likely to become discouraged.
It’s also why installing and configuring software is a much bigger problem for us than experienced programmers like to acknowledge. It isn’t just the time we lose at the start of boot camps as we try to get a Unix shell working on Windows, or set up a version control client on some idiosyncratic Linux distribution. It isn’t even the unfairness of asking students to debug things that depend on precisely the knowledge they have come to learn, but which they don’t yet have. The real problem is that every such failure reinforces the belief that computing is hard, and that they’d have a better chance of making next Thursday’s conference submission deadline if they kept doing things the way they always have. For these reasons, we have adopted a “teach most immediately useful first” approach described in this episode.
Software Carpentry Is Not Computer Science
Many of the foundational concepts of computer science, such as computability, inhabit the lower-right corner of the grid described above. This does not mean that they aren’t important, or aren’t worth learning, but if our aim is to convince people that they can learn this stuff, and that doing so will help them do more science faster, they are less compelling than things like automating repetitive tasks.
This course has been taught as a multi-week online class, as a two-day in-person class, and as a two-day class in which the learners are in co-located groups and the instructor participates remotely.
This was the second method we tried. The biggest change was the introduction of recorded teaching exercises.
Several times during the training, participants are divided into groups of three and asked to teach a short lesson (typically 2-3 minutes long). In turn, one person is the teacher, one the audience, and one the videographer, who records the teacher using a handheld device such as a phone. Group members then rotate roles: the teacher becomes the listener, the listener records, and the videographer teaches. Once all three have finished teaching, the group reviews all three videos, and everyone gives feedback on everyone (including themselves). This feedback then goes into the Etherpad for discussion.
It’s important to record all three videos and then watch all three: if the cycle is teach-review-teach-review, the last person to teach runs out of time. Doing all the reviewing after all the teaching also helps put a bit of distance between the teaching and the reviewing, which makes the exercise slightly less excruciating.
This exercise only works if there are breakout rooms available: if everyone is trying to record in the same room, the audio cross-talk makes the recordings unintelligible.
We use Etherpad for in-person training, both for note-taking and for posting exercise solutions and feedback on recorded lessons. Questions and discussion are handled aloud.
We use Google Hangouts and Etherpad as in the multi-week version. Each group of learners is together in a room using one camera and microphone, rather than each being on the call separately. We have found that having good audio matters more than having good video, and that the better the audio, the more learners can communicate with the instructor and other rooms by voice rather than by using the Etherpad chat.
We do the video lecture exercise as in the two-day in-person training.
This was the first method we tried.
We meet every week or every second week for an hour using Google Hangout or BlueJeans. Each meeting is held twice (or even three times) to accommodate learners’ time zones and because video conferencing systems can’t handle 60+ people at once. Each meeting also uses an Etherpad for shared note-taking, and more importantly for asking and answering questions: having several dozen people try to talk on a call hasn’t worked, so in most sessions, the instructor does the talking and learners respond through the Etherpad chat.
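When scheduling the duplicate sessions, it helps to translate one UTC meeting time into participants’ local times before announcing it. A minimal sketch using Python’s `zoneinfo` (the date and the zones listed are only examples):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Example meeting time, expressed in UTC.
meeting_utc = datetime(2016, 3, 2, 16, 0, tzinfo=ZoneInfo("UTC"))

# Example participant time zones; substitute your trainees' zones.
zones = ["America/Vancouver", "Europe/Berlin", "Australia/Sydney"]

for name in zones:
    local = meeting_utc.astimezone(ZoneInfo(name))
    print(f"{name}: {local:%Y-%m-%d %H:%M}")
```

Listing the times this way makes it obvious when a single slot leaves some region at 3 a.m., which is exactly when a second (or third) run of the meeting is needed.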
Learners post homework online, then comment on each other’s work.
Checklist for instructor trainers hosting a live-coding demo session as part of a trainee’s checkout procedure.