Web Testing for TechSmith

Methods and Tools Used: 
• Unmoderated usability testing (UserZoom)

• Card sorting (UserZoom)

• Tree testing (UserZoom)

• Hotspot click testing (UserZoom)

As a user researcher at TechSmith, I worked heavily with customer-facing web content to make sure it met customer needs and allowed them to reach their goals with the least amount of time and hassle. 

In collaboration with the UA, UX, and development teams, I worked to uncover the ideal user experience and improve the Sign In Portal for TechSmith.com and the Zendesk help documentation system. Across our unmoderated tests, I implemented multiple methods: task-based usability tests, card sorting, tree testing, and hotspot click testing. 


TechSmith wanted to give their customers a portal where they could input product information for ease of use. The portal had users sign in with an email or social account to create an account, add keys for products they own, purchase new products, and enter customer information such as their address and profile picture. The portal was created and linked from the homepage on TechSmith.com.

Issues quickly arose: time-on-task for the portal's Sign-In experience was high, and users seemed to have a hard time recovering their password or finding a help article explaining how to do so. There was also a lack of understanding of what users' goals were within a portal experience and what kind of features would make a better user experience. To uncover the ideal state of the portal, we needed a mixed-methods approach, pairing exploratory research with traditional testing methods to answer these questions.


Improving the Incentive and Onboarding Experience for Participants

As each round and method was conducted, I was tasked with creating an incentive and onboarding structure for tests so designers and developers could run their own tests using UserZoom. The goal was for the other researcher and me to figure out a better incentive structure that let us move fast across different methods by getting more engagement and task completion. 

During every round of the usability tests, I created screeners for recruiting and introductory instructions that included screen-grab videos, examples of proper use, and infographics. We had noticed a lot of drop-off: we usually recruit from various pools of customers and non-customers, and because the tests were unmoderated, we needed to keep people engaged through completion. After testing was completed, I wrote up a “best practices” document that covered incentive pricing, how many participants specific tests needed, and when to use particular testing methods. 

Step 1: Baseline Testing and Time on Task for Sign On Pages

Most of the previous attempts at testing the sign-in portal were done using UserZoom. I initially created three tasks for participants to complete while UserZoom recorded their screens, with feedback collected after completion:

1. Sign in or create an account

2. Change your profile picture

3. Add your software key

This helped the design team uncover specific usability issues with certain workflows. These issues could be quickly addressed within the next design sprint. 
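As a rough illustration of how baseline results like these can be summarized, the sketch below computes mean time-on-task and completion rate per task. The data rows, column names, and timings are hypothetical, not TechSmith's actual UserZoom export:

```python
# Hypothetical UserZoom-style results: one row per participant per task.
# Task names mirror the three baseline tasks above; timings are invented.
results = [
    {"task": "Sign in or create an account", "seconds": 94, "completed": True},
    {"task": "Sign in or create an account", "seconds": 151, "completed": False},
    {"task": "Change your profile picture", "seconds": 42, "completed": True},
    {"task": "Change your profile picture", "seconds": 58, "completed": True},
    {"task": "Add your software key", "seconds": 130, "completed": True},
    {"task": "Add your software key", "seconds": 210, "completed": False},
]

def summarize(rows):
    """Mean time-on-task and completion rate, grouped by task."""
    buckets = {}
    for row in rows:
        b = buckets.setdefault(row["task"], {"times": [], "done": 0, "n": 0})
        b["times"].append(row["seconds"])
        b["n"] += 1
        b["done"] += row["completed"]  # True counts as 1
    return {
        task: {
            "mean_seconds": sum(b["times"]) / b["n"],
            "completion_rate": b["done"] / b["n"],
        }
        for task, b in buckets.items()
    }

metrics = summarize(results)
for task, m in metrics.items():
    print(f"{task}: {m['mean_seconds']:.0f}s avg, {m['completion_rate']:.0%} completed")
```

Tracking these two numbers per round makes it easy to show stakeholders whether a design change actually moved the baseline.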


One missing piece of information was how people naturally interacted with portals for technologies such as Snagit and Camtasia. These were add-on experiences, meaning participation wasn’t “make or break” for using the digital products. We didn’t know what type of information people would want to input into the portal, what kind of features they were looking for, or whether they would link their social media accounts to the portal. 

When I started brainstorming with the lead designer for the website on testing and discovery tactics, we looked for ways to create a usability test that not only helped us fix key issues on the website, but also gave insight into similar behaviors users have across other portals.

I decided to conduct a baseline, competitive test of four different product sign-in portals to see which social media accounts participants used to sign in, what actions they took once signed in, and what feelings or emotions they had after creating an account or signing in.



Step 2: Usability Testing the Password Recovery System

Another unmoderated task implemented across the testing rounds was a password recovery step. One issue TechSmith kept seeing was that users avoided the password recovery system and instead made multiple accounts when they forgot a password. As we evaluated the data, we quickly found that users had to jump through 3-4 different steps within the TechSmith website before their password recovery information was sent to their email. The social sign-in options made recovery even more complicated: if users forgot their social media password, TechSmith had no power to reset it. 


Testing the baseline sign-in experiences and having users complete tasks within the portal gave insight into the common methods users follow when signing into accounts. We noticed that participants would create an account using email, and would only use a social media account when they knew of a specific benefit (e.g., using Facebook on Spotify to add Facebook friends to their account). This helped TechSmith cut the sign-in portal down from three options (Twitter, Google, email) to just email. In future iterations, we noticed a significant improvement in time-on-task and positive feedback from users, even ones who had used the social media options in the past. I also passed along the password recovery data to the team, who started exploring better and more efficient ways to recover a password. 

Step 3: Testing Help Pages 

After testing the main TechSmith homepage and sign-in portal, we knew we needed to make Help content more findable and more in line with the portal experience. I led card sorting, tree testing, and click testing of the software key pages to help create a more streamlined and findable Help experience. 

Originally, TechSmith had two different sites for tutorial and help content. Neither had ever had its labels, workflows, or hierarchy of tags and filters tested. The goals of testing were to see the common workflows users went through to solve issues, which labels worked best for the taxonomy and hierarchy of the help site, and where users preferred to go to solve specific product issues.

I started with a tree test to determine what the different help portals should be named on the main TechSmith.com homepage. Participants were given the task of finding a certain piece of information through a hierarchical structure based on the information architecture of the help site. Through this, we learned where users expected to find crucial information and where our labels might have been off. 
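Tree-test results like these are usually read through two simple numbers: success rate (did the participant end at the right node?) and directness (did they get there without backtracking?). Here is a minimal sketch of that scoring; the hierarchy labels, session data, and backtracking flags are invented for illustration, not taken from the actual study:

```python
# Hypothetical tree-test sessions: each records the labels a participant
# clicked through while looking for their software key in the help hierarchy.
sessions = [
    {"path": ["Support", "Account", "Software Keys"], "backtracked": False},
    {"path": ["Tutorials", "Support", "Account", "Software Keys"], "backtracked": True},
    {"path": ["Tutorials", "Getting Started"], "backtracked": True},
]

def tree_test_metrics(sessions, destination_leaf):
    """Success = ended at the correct node; direct = success without backtracking."""
    n = len(sessions)
    successes = [s for s in sessions if s["path"][-1] == destination_leaf]
    direct = [s for s in successes if not s["backtracked"]]
    return {"success_rate": len(successes) / n, "directness": len(direct) / n}

scores = tree_test_metrics(sessions, "Software Keys")
print(scores)
```

A high success rate with low directness is the classic signal that a label is guessable but not obvious, which is exactly the kind of mismatch the tree test surfaced.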


After the tree test was complete, participants were given task-based scenarios that asked them to record their screen and find an article based on an issue described in the prompt. I set up the test similarly to the sign-in portal test: multiple participants found certain articles through search, tab navigation, and a combination of both. Instructing them to use more tutorial-focused articles or help documentation helped us figure out why people relied on certain labels to find articles and which workflows were most common. I also evaluated the navigation between pages and where people’s cursors went first when looking at a page, which helped the UX and UA teams clear up many site issues before the redesign.


The main takeaway from the test was that when people search and read articles, we are training them to find information in a certain way, even over the briefest time period. That has a huge impact on how people browse, search, and view content in Zendesk or a CMS. The common workflows and pain points were brought into design sprints and improved before we launched a beta experience of the new help system.



After the rounds were fully completed, I translated the data and presented it to multiple teams of designers, UA professionals, developers, and project leaders. This required me to showcase the findings in a compelling way while also helping people who aren’t always user-oriented start thinking from the user’s perspective once the report was shared. Pulling in the design team early also helped build confidence within the team, because empathy had been developing throughout the project lifecycle. 

After my findings were shared, time-on-task improved significantly across the four rounds of testing. As shown below, a Microsoft sign-in UI was gradually introduced that only allows users to sign on via email. This change, uncovered through the first round of baseline testing, drove a gradual decrease in the time users spent creating an account with TechSmith. 

We also saw a significant decrease in multiple account creation, since the workflow of recovering an account was re-addressed by the UA and UX team. 


The help documentation beta experience was implemented, and I worked closely with the UA team to gather feedback from users and apply findings from the tree test and unmoderated tests to make feature improvements before the launch.

Throughout agile product life cycles, I have to think on my feet and use multiple methods so design and development teams can act on important insights to improve users’ experiences. Figuring out the ideal method to use, the key insights to share, and user-focused design patterns and labels is something I am highly passionate about, and I will keep doing it to make sure every service and product I work on is made with the user in mind.