In October 2016, I had my first opportunity to provide support on two different User Testing Journeys. Here is what I learned.

If you’re a UX designer, you’re already a UX tester.

Before this experience, I believed user testing was something that required expertise, or at least some training. Like many designers, I thought you always needed to pay a big user testing firm to do your research for you. There had to be some user testing secrets that only the educated and experienced knew about. But I soon discovered that jumping from UX designer to UX tester is not a big leap.

The questions I ask myself all day – Can a user do this? Is this easy to understand? – are the same questions you ask when testing users. The personas I created translated exactly into the kinds of people we searched for to take the test.

The less you know the better.

A huge advantage in participating in a user test is knowing nothing about the history of the project, the design, or why any of the decisions were made. When I started my second user testing assignment, I feared I didn’t know enough about the project, but I soon learned my ignorance would work to my advantage. First, I was able to help write the script from the same perspective my testers would have. Fresh eyes were also good for catching inconsistencies and bugs in the prototype. It also helped during the actual testing: when I asked a question, it kept me from giving out too much information that could have influenced the user’s behavior. But most importantly, not being involved in the design kept me unbiased toward the answers. You’re more inclined to admit a design isn’t working when it’s not your baby.

Users are smart.

I tend to spend all day underestimating users. No matter the persona, I still seem to constantly imagine an eighth grader or my grandmother. Watching user tests brought me back to the reality that most people can grasp complex concepts if those concepts are presented correctly. While this was a nice wake-up call, I have to trust that my habit of underestimating users contributed to the successful tests.

Don’t worry about testing a lot of people.

Of the two tests I supported, one project had 10 testers and the other 5. It became clear very quickly why it’s unnecessary to test more than 10 people: each time we went through the test, the users tended to have the same issues and comments. But while all testers had generally the same comments, almost every tester had an outlier suggestion or mishap. What I still have to learn is how to handle those one-offs. Should individual comments be ignored or considered?

All user experience designers should be involved in testing.

As user experience designers, we constantly try to get inside our users’ heads and understand what they want and how they think. So after participating in user testing, it surprised me that we separate testers and designers instead of involving both. I can spend all day imagining my users, but it’s a great eye-opener to actually meet them. Whether or not I am invited to participate in user testing from now on, I will always ask to watch the testing videos because of the insight I gather from them.

Always be learning.

My favorite part of UX testing is how much I learn from it. For example, I was once concerned with making an experience as short as possible so people could get on with their lives. But what I found is that my users didn’t get frustrated with the amount of time they spent doing a task; they only got frustrated with the amount of time they spent confused or doing redundant tasks. We also worried about having too much copy on a page, yet I often heard requests for more copy, not less. These decisions live on a fine balance that can always be tricky, and this is where user testing is particularly helpful: it reveals where we can add, remove, or clarify.

Make sure you’re asking the right questions.

During user tests, we do our best to recreate the real-world experience. But the fact that we can’t perfectly recreate conditions might be enough to completely negate some of our test results.

For example, we asked one group of testers if they could use our application to record the greeting for their business’s answering machine. Hitting “record” and “stop” buttons is a simple enough task with common patterns, so, as expected, we had zero problems there. But we did not test the human elements that would make this task frustrating in the real world. For instance, does the user know what greeting they would like to record? Do they have to get it approved by someone else before it can be officially recorded? What if their computer doesn’t have a microphone? What if their computer is not in a quiet place for recording? Any one of these questions could create real frustration and confusion.

I found that testers don’t consider all these other real-world elements when answering questions, as I thought they might. From now on, I will note possible environmental distractions or hang-ups, make sure the testers are considering those factors, and ask more directed questions to discover where they might find additional frustrations.