2020's lesson: meet your learners at their confidence level

Laurie Mezard

Running a new program for the first time is always a humbling and insightful experience. Assumptions are challenged. Some things work well right from the start, and others don’t. If you pay attention to the feedback, you get many opportunities to iterate and improve on this first experience.

This year, Rbean ran the Tech Manager program with Matrice for the first time (more details about Tech Manager here). It’s a part-time coding, part-time management program.

Now, it isn’t our first time designing and running coding programs. But it’s our first time designing a program in which our speciality, coding, is only half the curriculum. It’s also our first time not recruiting for mindset first.

In previous programs, the candidate selection always included an intensive, experiential bootcamp. During this bootcamp, students would face many challenges and be asked to solve them as a community, without relying on direct instruction. Our role was to be guides on the side: we would support the students and give feedback on their work, but without ever giving the answers upfront.

This process allowed candidates to try out our challenges-based approach and to decide if it was for them before committing to it. They either loved the method - and were passionate about the school - or dropped out of the bootcamp quickly.

This time was different. Instead of being a selective step, the bootcamp became a regular module in the overall program. Dropouts were therefore not an option, as students had already committed to the program.

Because we anticipated some difficulty, we beta-tested the bootcamp with non-developers. We got mostly positive feedback. Reassured, we went ahead.

But learners actually struggled with the bootcamp, and even with projects later on. Like, really struggled. Many were confused and frustrated by the problem solving process. Some procrastinated their discomfort away. Most kept on bravely and learnt a lot but, nevertheless, I believe the experience was more painful than it should have been.

The mistakes we made

In retrospect, here are the main mistakes we made:

  • Mismatch with learner aspirations
    The Matrice program is built and marketed as part-coding, part-management. That means that not all learners want to become software engineers (although some do). This plays a big part in how much they are willing to stick with the frustrations that come with solving coding problems.

  • Not realising our approach didn’t suit less confident learners
    Before, bootcamp participants who didn’t feel comfortable with our approach would simply drop out. We were never forced to address the discomfort of less confident learners.

  • Not setting the right expectations
    The program wasn’t marketed as a highly challenging program, which would have attracted people who already enjoyed solving challenges. Instead, we had learners who had less experience facing challenges and needed more guidance on the process itself.

  • Not testing with the end users
    We beta-tested the bootcamp with people who didn’t match the target learners closely enough: they were already used to problem solving and to working in an innovative environment. As a result, the feedback we got was of little use.

What’s dangerous is that learners don’t tell you directly when your design makes them feel stupid! If you don’t pay enough attention, there’s a risk of everyone (your learners included) putting it down to a lack of ability or effort.

But as an instructional designer, it’s your job to make sure learners feel safe and confident. It’s my job to figure out learners’ needs and to meet them where they are.

The next iteration to provide a more guided & secure experience

The whole program starts again in September, so right now we are working hard on making the learning experience more guided and the environment more emotionally secure.

This includes:

  • Video tutorials to accompany the existing written tutorials, since some learners are more confident when they can watch you go through the motions
  • More beginner challenges, to provide quick wins that sustain learner motivation
  • Hard challenges unlocked based on performance, so beginners don’t feel pressured
  • Self-assessment quizzes alongside projects, to support the problem solving process and help students self-diagnose the issue whenever they get stuck
  • More examples and modelling of efficient problem solving
  • Gamification features to provide more meaningful goals and more fun

These changes might not solve every problem at once, and because we are implementing several measures together, it will be difficult to assess which ones have the biggest impact.

However, making sure learners feel safe, secure and confident in their skill acquisition is our priority right now.

We’ll have many more opportunities to fine-tune the system down the line!

Laurie Mezard

Co-Founder & Pedagogy Specialist

