Web testing for TechSmith


I worked with multiple teams across TechSmith to help improve and streamline online, customer-facing content, such as the sign-in portal for TechSmith.com and the Zendesk help documentation system. I used multiple methods for our unmoderated tests, creating tasks for task-based usability tests, card sorting, tree testing, hotspot click testing, and more.

Sign-in portal testing overview


As a user researcher at TechSmith, I work heavily with customer-facing web content to make sure TechSmith's websites, whether for marketing, help, or other services, fit customer needs and let users reach their goals with the least time and hassle.

I worked on multiple testing projects to help the web and UA teams create websites and portals that fit both business and user goals. For instance, I helped the Identity team streamline and improve the sign-in process on signin.techsmith.com. As a business, TechSmith wanted users who owned its software to sign in to the TechSmith portal with an email or social account so they could keep all of their keys and accounts in one place. To do so, the sign-in and account-creation portal was created and linked from the TechSmith.com homepage. But after multiple testing iterations, there were still open questions about which sign-in options users needed, what content should appear first, and what information (software keys, profile pictures, extra contact details) users wanted to add.


To create a better digital identity for TechSmith, I researched similar sign-in portals by testing them as a baseline, learning more about sign-in behaviors, and comparing completion times, methods, and user feedback against the current TechSmith sign-in portal.

Research methods

Most of the previous attempts at testing the sign-in portal used typical unmoderated usability testing in UserZoom. We initially created three tasks: (1) sign in or create an account, (2) change your profile picture, and (3) add your software key. This taught us about specific usability issues in certain workflows, but not about how people decide on a workflow (social sign-in, password recovery, etc.), how they interact with portals naturally, or what kind of information they want to share with the company.


When brainstorming with the site's lead designer, we looked for a usability test that would not only help us fix key issues on the website but also resolve the conflicting conversations we had noted about what a good sign-in portal looked like.

I decided to conduct a baseline, competitive test of four different products' sign-in portals to see which social accounts participants used to sign in, what actions they took once signed in, and how they felt after creating an account or signing in.


We learned a lot of useful information, such as the common methods people use when signing in to accounts. Most of the time, participants would create an account using email, and would only use a social account when they knew of a specific benefit it provided (e.g., signing in to Spotify with Facebook to add friends from Facebook). This helped TechSmith cut the sign-in portal down from three options (Twitter, Google, email) to just email. In later iterations, we saw a significant reduction in completion time and positive feedback from users, even those who had used the social options in the past.


We also evaluated password recovery, comparing designs to see how they influenced whether people created a new account instead of recovering their existing password. In the newest design, the process was streamlined, drawing on baseline-test data about how people expect a recovery system to work. We also iterated on and tested the ability to change account information, add software keys, and edit profile pictures inside the portal.



Future tests and Zendesk help portal

After testing the main TechSmith homepage and sign-in portal, I eventually led more user-focused web projects, such as card sorting and tree testing of the TechSmith.com navigation, click testing of the software key pages, and working with the UA team to make help content more useful and findable in the help portal. Originally, TechSmith had two separate sites for tutorial and help content, plus many labels that had never been tested for usefulness or findability. We wanted to see what content people were interacting with, whether better labels (tutorials vs. help documents) would help, and whether the split between the tutorial and help portals made information more consumable or simply frustrating.

I started with a tree test to determine what the different help portals should be named in the main TechSmith.com navigation. A tree test is a research method for evaluating the findability of website content: participants are given an information structure and asked to find a certain piece of information (here, the navigation and findability of “Help & Support”). Through this, we learned specifically why users chose one Snagit-related link over another.
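To make the tree-test analysis concrete, here is a minimal sketch of how results like these can be scored. It assumes each participant's click path through the tree is recorded as a list of node labels; the node names, paths, and scoring helper are hypothetical, not taken from the actual study, and the two standard metrics shown (success and directness, i.e., reaching the target with no backtracking) are common tree-test measures rather than TechSmith-specific ones.

```python
# Hypothetical tree-test scoring sketch. Assumes click paths are
# recorded as lists of node labels; all names here are invented.

CORRECT_NODE = "Help & Support"

def score_path(path, correct=CORRECT_NODE):
    """Return (success, direct) for one participant's click path.

    success: the participant ended on the correct node.
    direct:  they reached it without revisiting any node (no backtracking).
    """
    success = bool(path) and path[-1] == correct
    direct = success and len(path) == len(set(path))
    return success, direct

# Example click paths for three participants.
paths = [
    ["Products", "Help & Support"],                        # direct success
    ["Products", "Store", "Products", "Help & Support"],   # indirect success
    ["Store", "Downloads"],                                # failure
]

results = [score_path(p) for p in paths]
success_rate = sum(s for s, _ in results) / len(results)   # 2/3
directness = sum(d for _, d in results) / len(results)     # 1/3
```

Aggregating these two numbers per label is what lets a study like this compare candidate navigation names against each other.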


The remaining tasks were task-based scenarios that asked participants to find an article based on an issue we gave them or a skill they wanted to learn in Snagit. I set up the test similarly to the sign-in portal study: multiple participants found specific articles through search, tab navigation, or a combination of both. Directing them toward tutorial-focused articles or help documentation helped us figure out how people categorized certain articles and why they gravitated toward one help portal or the other.


The main takeaway was that when people search and browse articles, we are training them to find information in a certain way, even over the briefest period, and that has a huge impact on how people browse, search, and read content in Zendesk or a CMS. I also evaluated the navigation between pages and where people's eyes go first on a page, which helped the UA team resolve many site issues ahead of the redesign, before we launched a beta that used the site for onboarding.


Improving the feedback process

During every round of the four-round usability test, I created recruiting screeners and introductory instructions, which included videos, examples of proper use, and infographics. We had noticed a lot of drop-off: we usually recruit from various pools of customers and non-customers, and because the tests are unmoderated, we needed to keep people engaged enough to complete them. I also brainstormed and experimented with incentives, helping to create a better incentive structure for the design team's own UserZoom tests.


Every day, I interact with projects like these. I need to quickly create, iterate on, and present usability tests, and to use multiple methods so our design and development teams can apply our research in multiple ways across multiple projects. I learned not only useful unmoderated testing methods, but also how to take a large amount of data and numbers and find useful patterns through sorting, tagging, and text analysis. I presented to teams of designers, UA professionals, developers, and project leads, communicating my findings clearly in writing and in person, which taught me a great deal about communicating and advocating for users while building professional confidence in my presentations and written communication.