User testing is an essential tool for developers to understand how users respond to and interact with their product. Watching a user interact with a product in development can provide vital insights that a survey, interview, or analytics just can't. User testing does not require a big budget or a big team. Even if you can't do user testing perfectly, you will learn a lot from whatever version of it is possible for you.

When it comes to user testing, I have been on both sides of the table. I have run dozens of user testing sessions for the language learning game we are developing. Out of a desire to truly understand our testers, I have also participated as a product tester for a few different companies. These are my top three recommendations on how to run better user tests based on those experiences:

1) Find the language that encourages your tester to give you constructive feedback. While the majority of a user testing session should be focused on observation, getting the user to open up and verbalize what they don't understand and what they don't like is essential. Testers, however, sensing how much work you have put into your product, often don't want to say negative things to you. Open your testing session by explaining that you know there are still things wrong with your product and that their feedback will directly help your development. Try providing an anecdote about a time when someone's constructive feedback resulted in a positive change to your product.

2) Make the experience less intense for the tester. Keeping testers at ease builds better rapport and elicits more realistic feedback. This is a user test, not a stress test! As a tester with one company, I had three people hovering over me while I used a mobile device. Keep the number of observers to a minimum. I recognize that letting others on the team see the test can be incredibly important, so consider using a video camera to record the session. My best experience as a tester was an entirely recorded session: I felt the most relaxed and confident in the feedback I was providing. Of course, recording alone limits the follow-up questions to whatever you decided in advance, so find a happy medium.

Along the same lines, I recently spoke with someone who said she hates user tests because, as a tech person, she feels she should be able to figure out how to use the product and gets stressed when she can't. Make sure to communicate to your testers that if they have trouble figuring out your product, it is your fault as the developer, not theirs! Let's help people love the user testing experience so that we have a robust pool of people excited to test products.

3) Remember to stay focused on the observation component and not get lost in follow-up questions. While some pointed follow-up questions can be helpful, remember that users don't necessarily know what they want. Asking a question such as "If we did x, would you like the experience better?" will get you nowhere. If you think change x might help, prototype it and test it. A great follow-up question, however, is "What did you find confusing about x?". I have been a tester in sessions that should have ended early; those running the test felt they needed to use the entire allotted time, so they kept asking me questions I could do nothing but take wild stabs at. Answers like that only add confusion.

User tests are vital to developing a great product. I hope these tips can help you run better user testing sessions. I am sure there are tons more great ideas out there. What else have you found helpful?


So you have a great idea, now what? You have probably heard plenty about the Lean Startup movement and maybe have read Eric Ries's book The Lean Startup, but putting lean into practice is really hard. You need more information before you build, but the problem is that building product is fun and customer interviewing isn't always so fun. Steve Blank's book The Four Steps to the Epiphany gives tons of great info on customer development, but it is a lot of info at once. The goal of this post is to provide a basic overview of the early step of customer interviews and some practical ideas to get it done.

I have spent the last several months making and launching MVPs for a few different ideas I have had. For the first one, I just did a survey, convincing myself that it would be good enough… it wasn't.

Surveys fail because they don't capture the nuance of people's reactions and they don't let you ask follow-up questions. Our first MVP was a website that aimed to provide an informative, gamified platform for people interested in living more sustainably. Relying on surveys left us with limited data that encouraged us to invest the time to build a website that wasn't worth building. The data seemed to confirm our idea (and we humans love nothing more than validation), but it hid major flaws.

The first tricky part is finding customers to actually interview.  Here are some of the strategies that I think work for this, though I would love to hear from others if they have additional ideas.

B to C Strategies:

  • Community groups to find individuals willing to talk
  • Mechanical Turk – there is a great tutorial on Customer Development Labs on getting interviews set up this way
  • Cold approach – i.e., standing outside relevant places of business (with permission) and walking up to people; not very fun, but it can be effective

B to B Strategies:

  • Introductions through your network
  • Cold calling

Crafting your customer interview questions is the next tricky step.  The problem is that people are generally nice and don’t want to tell you bad things about your idea.  They also don’t really know what they want.  Check out this talk by Rob Fitzpatrick on Getting Customer Development Right.

Some of the major questions you want to answer in customer interviews include:

  1. Do people actually have the problem I think they have?
  2. How big of a pain point is it for them and will they actually pay to solve this problem?
  3. How are they solving this problem currently?
  4. How easily can my solution integrate into the customer's life, or will there be major roadblocks to adoption?
  5. What other assumptions am I making that determine the success of the business? The business model canvas is a great tool for identifying the assumptions your business idea depends on.

When I did interviews for my next product, FitCycle, I found myself falling into pitch mode really easily… just don’t do it.  FitCycle is an app that provides indoor cycling workouts, including motivational music and instruction, via the convenience of your phone.  Here are some examples of better questions I eventually got to:

Key assumptions

  • People can’t always make it to an actual class
  • People are bored and looking for solutions to their regular cardio routine
  • There aren’t great solutions out there currently
  • People have access to a spin bike

Interview questions

  • How happy are you with your current cardio routine?
  • Do you attend spin classes regularly?  If not, why not?
  • Do you belong to a gym with spin bikes you can use or do you have a spin bike at home?
  • Do you try to do spin workouts on your own?  If not why not?

Armed with all the great information you get out of customer interviews, your original vision of an MVP will likely change, and that’s a good thing… because now it is based on something more concrete than an idea you think is cool.

Thanks to the guys at Lean Startup Peer-to-Peer Circle for helping me figure a lot of this out.