Lessons learnt from user testing our beta product
Usability testing is a way to evaluate a product or product feature by testing it on users. We do this by getting a group of users to complete specific tasks within a defined length of time. It allows us to experience the product from the visitors’ perspective and identify opportunities to improve. If several users have similar difficulties during the test, that tells us that changes should be made to improve the user experience.
How did we do it?
The nsw.gov.au alpha product was built in just five weeks, to illustrate our content strategy research, present design concepts, and build a prototype of the website. Over the following months we further developed the website into a beta product. The beta consolidated information from Births, Deaths & Marriages, the Department of Customer Service and over 2,000 pages migrated from the old nsw.gov.au website. Due to time constraints, we knew that developing the information architecture, navigation and design would have to be a continuous, iterative process. We needed to understand from users whether the new beta version was usable, and which areas we should focus on improving.
We had a month to test the beta before launch. To use that time as best we could, we ran four tests while also working on other parts of the beta.
The objective was to make it easy for people to find information on the new website. In the tests, we measured how well users could:
- use the home page
- understand, use and navigate the site, including the main navigation, cards and the footer
- use the search functionality and keywords
- understand the purpose and the scope of the site
- trust the site and relate to the design.
To make sure that the people involved in testing represented different users, we recruited participants from culturally and linguistically diverse backgrounds and a broad age and socio-economic range, including people:
- between 18 and 70 years of age
- from a range of marital and dependency statuses
- who were students, business owners, unemployed, employed full time or retired.
Google Analytics data from July to December 2019 showed that 50% of our users were on mobile, 43% on desktop and 6% on tablet. We made sure to test that the beta looked and worked well on both mobile and desktop.
Per best practice, the person running the test should be as unbiased as possible, so we outsourced the testing sessions to an external agency. Our team observed each session to gain as much knowledge and understanding of the users’ experience as possible. Two types of tests ensured we made the best use of the time we had: moderated one-on-one sessions and remote user testing with 40 participants.
Outcome
Seeing users in action on the site for the first time is always exciting for a design team. In general, the site performed well, and users told us they were happy with the experience. Importantly, they found it reasonably easy to find information and complete their tasks. They told us that they trusted the site and that the language was easy to understand. On average, users rated the product 8 on a scale from 1 to 10 when asked about their experience.
We noticed a difference in user behaviour on desktop compared to mobile when searching for information on the site:
- 56% of mobile users started by clicking on the hamburger menu, while 48% of desktop users began with the main top navigation.
- 35% of mobile users and 24% of desktop users started by scanning the page, but would also use the navigation.
- 24% of mobile users and 9% of desktop users never used the top navigation at all. Instead, they found information by using the search functionality or by scanning the page and clicking on links.
Labelling the navigation and creating ways for users to find related information were areas for improvement.
Next steps
People’s circumstances change, and so does the way they use the site. We need to conduct user testing on a continuous basis to ensure nsw.gov.au continues to be customer-centric as it grows. We continually measure the user experience by observing people’s behaviour in context through analytics and heatmaps, and by running A/B tests and surveys. We also need to conduct more interviews and one-on-one sessions to validate our findings from the usage analysis and data.