It's no secret that moderated user testing can deliver outsized value for product teams. Interacting with real people helps us to see beyond our own perspective and validate our designs or inform changes.
But when industry-standard platforms like usertesting.com start at $30k per month, it's no wonder stakeholders are reluctant to invest in the process.
Luckily, we stumbled upon a dramatically cheaper option that doesn't skimp on quality. We combined the UserInterviews and Lookback platforms to inform a redesign for our eCommerce client charlottesweb.com, and we were pleasantly surprised with the results.
Not only did these tools streamline our workflow by handling all of the associated administrative tasks, they also helped us deliver high-quality feedback to our client for hundreds of dollars instead of thousands.
The platforms explained
High quality candidates without the busy work
UserInterviews is a highly flexible user testing recruitment platform that does the heavy lifting for you. Based on your desired participant characteristics, it sources qualified candidates from its own vetted pool. If no one on hand meets your specific criteria, it recruits from LinkedIn or Facebook. It will even schedule the interviews for you based on your availability, removing the menial administrative work from the equation. At $20 per candidate (plus a participant incentive of your choosing), it's an unbeatable value. Read this to learn more.
All the resources of a research lab without the overhead
Lookback makes moderated (or unmoderated) user testing easy by recording the user's voice, face, and screen while you take time-stamped notes on their behavior. Once the participant downloads the app or Chrome plug-in, you send them a link to join the testing session and you're ready to begin. Lookback works with Apple and Android devices of all kinds. Read this to learn more.
How it worked
1. Initial setup
We started by setting our available schedule for user testing and selecting participant demographic criteria. We also created a screener survey to help UserInterviews narrow our pool of candidates. Within a day our dashboard started populating with several candidates to choose from, and within a few days we had over 100 options. After we selected the candidates we liked most, our recruiter reached out to the participants to schedule the testing sessions on our behalf.
2. Test preparation
Using our newly created Lookback account, we messaged our candidates on the UserInterviews dashboard to confirm our scheduled time and to send them a link with instructions for downloading the Lookback app (there is also a Chrome plug-in for desktop). We had an issue with only one participant, who could not download the app. We were able to quickly and easily swap them out for a different candidate, free of charge.
3. Conducting the test
On the day of each interview, we opened Lookback at the scheduled time and initiated the session. Once the participant joined, the audio and video recording began and we were off to the races. At first we were able to converse directly via video chat, which was useful for introducing ourselves and our process. Once we were ready to begin testing specific tasks, the participant would hit a button that let us view their full screen alongside a video thumbnail of their face. This way, we had a recording of their on-site activity as well as their verbal and facial reactions.
4. Test review
Following the interviews, we had access to the entire session recordings for review. We opted for the Pro-tier plan, which allowed us to create and share clips—important for sharing key learnings with stakeholders.
The complete BranchLabs process
Proposing user testing to the client
- We created a benefit-oriented slide deck that concisely communicated why the client should invest in user testing, how the results would inform improvements to the site, and what the process and cost of testing would be.
Moderating the interview
- We introduced ourselves, explained the intentions of the test, and confirmed consent using a modified script from Rocket Surgery Made Easy by Steve Krug (download the free script PDF).
- We asked the participants to speak their minds as much as possible and prompted them to do so whenever they were engaged with the test but not speaking aloud.
- We had tasks prepared for the interview but as much as possible we gave them free rein to navigate through the client website without direction—our intention was to influence their experience as little as possible.
Collecting & analyzing user data
- We took all of the feedback notes we had entered during the user tests and pasted them into a spreadsheet (about 100 items).
- Once in the spreadsheet, we color-coded the items that were similar in order to identify which issues were relevant for most participants.
- We took the most common issues and organized them into three themes.
- We reviewed our notes in Lookback to find clips that best represented each theme and linked them within the spreadsheet.
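The grouping step above is simple enough to sketch in code. Here is a minimal Python illustration (the notes and theme names are hypothetical, invented for this example) of counting how often each theme appears so the most common issues float to the top:

```python
from collections import Counter

# Hypothetical sample of tagged feedback notes; in practice these came
# from time-stamped Lookback notes pasted into a spreadsheet.
notes = [
    ("couldn't find the search bar", "navigation"),
    ("unsure which product size to pick", "product info"),
    ("shipping cost surprised me at checkout", "checkout"),
    ("menu labels were confusing", "navigation"),
    ("wanted more dosage guidance", "product info"),
]

# Count how many notes fall under each theme, mirroring the
# color-coding step that surfaces the most common issues.
theme_counts = Counter(theme for _, theme in notes)

# Rank themes by frequency to pick the top ones for the client deck.
for theme, count in theme_counts.most_common(3):
    print(f"{theme}: {count} note(s)")
```

A spreadsheet with color codes does the same job; the point is simply to tally recurring issues across participants before choosing themes.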
Presenting our findings to the client
- We created a deck that focused on the three themes we had identified during data analysis.
- We had dozens of great clips to present, but ultimately narrowed them down to fewer than 10, at 60 seconds or less each, to make sure they communicated the message quickly and effectively.
- At the end of the deck we presented next steps to the client, focusing on high impact, low cost opportunities first.
Pricing
- UserInterviews: $20/participant + participant incentive (we chose $40)
- Lookback: monthly/annual plans ranging from $49-$119 per month
We can say with confidence that UserInterviews and Lookback offer the best value in remote moderated user testing. We were able to deliver high-quality feedback to our client without a hitch.
Learn more about these services: