Quick musing on the pilot programme for the learning replatforming project earlier this year.

Getting feedback is the principal aim of a pilot programme. A good pilot doesn’t let learners loose on a new platform or piece of software without requiring them to complete a predefined set of tasks, typically common scenarios such as signing in and amending their user profile.

UAT testers are usually presented with a pack that includes a set of instructions and a spreadsheet for their feedback. This could be a simple list of tasks with a column asking them to tick yes or no depending on whether they could complete each task, or something a lot more complex, with ticks for each stage. Either way, this testing methodology carries a lot of risk.

  • Testers might disregard the spreadsheet completely and send you an email detailing what they did and what happened. That often means a lot of work for you, deciphering their narrative and turning it into usable, comparable data.
  • Testers might get bored (guilty!) and only complete the tasks they were interested in, giving you incomplete data. This could result in skewed or inaccurate conclusions.
  • Testers might do the testing but forget to complete the spreadsheet until a week after the pilot, so they end up filling it in from memory. Again, this could result in skewed or inaccurate conclusions.

When we were planning how to run the pilot programme for the replatforming project, we had to think about our testers. They are busy people. It took the entire team and a couple of friendly senior leaders about a month to persuade 50 people to give up about an hour of their time to join the pilot test group, and we knew we had to make the best use of their time and get some really good data. We kicked around some ideas for spreadsheet-based feedback forms, and then our excellent project lead wondered how we could capture the experience with an online form. That was enough to send me into a developer huddle with myself.

I created an online testing guide using Microsoft Forms – it’s a familiar tool for our users and kept the data secure. Instead of a full-on Excel worksheet, we gave the testers a link to the online form and asked them to work through the testing tasks, answering a question for each one. If a tester marked a task as successful, they were automatically moved on to the next task. If they said they couldn’t complete it, we asked them to provide more details.
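Microsoft Forms handles that branching in its own settings, so there’s no code behind it, but if it helps to picture the flow, here’s a rough sketch of the logic in Python. The task names and fields are invented for illustration – they aren’t what Forms uses internally.

```python
# Illustrative only: Microsoft Forms configures this branching in its UI.
# This sketch just models the per-task flow we set up: a "Yes" answer
# moves the tester on to the next task, a "No" answer asks for detail first.

TASKS = [
    "Sign in to the new platform",   # hypothetical task names
    "Amend your user profile",
    "Search the course catalogue",
]

def record_task(task: str, completed: bool, detail: str | None = None) -> dict:
    """Record one tester's response to a single task."""
    response = {"task": task, "completed": completed}
    if not completed:
        # This is where the form branches and asks what went wrong.
        response["detail"] = detail or "No detail supplied"
    return response

def run_pilot(answers: list[tuple[bool, str | None]]) -> list[dict]:
    """Walk through the task list, branching on each answer."""
    return [record_task(task, ok, detail) for task, (ok, detail) in zip(TASKS, answers)]

if __name__ == "__main__":
    # One tester: completed the first two tasks, got stuck on the third.
    for r in run_pilot([(True, None), (True, None), (False, "Search button missing")]):
        print(r)
```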

Because the only way to take part in the test was to complete the form, we could monitor engagement (and gently nudge non-participants), and our testers didn’t have to worry about completing a spreadsheet, saving it and emailing it to our team. We also had a clean dataset to work from when it came to analysis.
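To give a flavour of the analysis side: Forms can export responses to Excel, and once you have that file a few lines of Python (or a pivot table, if you prefer) will summarise completion rates per task. The file name and column names below are assumptions for illustration – yours will follow however the questions were worded.

```python
# A minimal sketch of summarising pilot responses, assuming the Forms
# export has one row per tester and one Yes/No column per task.
import pandas as pd

# Hypothetical export file and column naming convention.
responses = pd.read_excel("pilot_responses.xlsx")
task_columns = [c for c in responses.columns if c.startswith("Task")]

# Completion rate per task: the share of testers who answered "Yes".
completion = (
    responses[task_columns]
    .apply(lambda col: col.str.strip().str.lower().eq("yes").mean())
    .rename("completion_rate")
)
print(completion.sort_values())

# Pull out the free-text detail for any task a tester couldn't complete,
# assuming those columns are named along the lines of "Task 1 - detail".
detail_columns = [c for c in responses.columns if c.endswith("detail")]
issues = responses[detail_columns].melt(var_name="task", value_name="detail").dropna()
print(issues)
```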

We also wanted to be on hand to answer any questions about functionality, connectivity and so on, so we set up a Teams chat for the testers. Teams chats are brilliant for instant feedback and for spotting potential issues before they become problems.

The results were everything we needed them to be: comprehensive, accurate and easy to analyse. All this meant we could quickly identify and resolve possible blockers to launch, and the testers were so enthused by our methodology and the platform that they acted as ambassadors and champions in the days before launch.