User acceptance testing (UAT) is a process that aims to validate a solution with its business users before it is released. Although UAT may appear to be a straightforward exercise, it comes with many challenges that can be difficult to address at the end of a delivery project. Here are our top five learnings from a recent project that involved user acceptance testing.
It was critical to engage the right users to represent the different roles and teams in the organisation so that we could receive balanced feedback from multiple perspectives. We began by engaging the stakeholders who had been involved in designing the new process or who had attended the requirements design workshops. We then invited additional users to take part in the UAT and eventually selected 50 users from the various business units, projects and locations across Australia, New Zealand and Singapore. All users were assigned test cases relevant to their field of expertise, group function and location.
To generate the UAT plan, we initially intended to reuse the test cases created for system integration testing. However, we discovered that the format of those test cases only allowed us to verify that the system met the functional requirements, not that it supported real-life scenarios. Most of the cases were too granular, covered overly broad circumstances and were written in complex technical language.
We worked to shift the team's testing mindset from "we should test the entire system as before and find zero bugs" to "testing should occur by users performing their job in the new process to identify critical areas of concern to be addressed". We then designed a new set of test scripts for UAT that would guide users through the new business process. These were written in plain, easy-to-understand language, focused on positive paths only and reflected real-life user scenarios. Users were able to validate the process and the supporting software, share their feedback with the team and grow confident that the new solution would enhance their day-to-day work.
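To make this concrete, here is a minimal sketch of how such a user-focused test script might be structured as data. All names, roles and steps below are illustrative assumptions, not the project's actual scripts: the point is that each script reads as a business scenario with plain-language actions and expected observations, and a simple pass/fail result the tester fills in.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestStep:
    action: str    # what the user does, in plain business language
    expected: str  # what the user should observe

@dataclass
class UatScript:
    script_id: str
    title: str
    business_role: str             # hypothetical role this script targets
    steps: List[TestStep] = field(default_factory=list)
    result: Optional[str] = None   # "Pass" / "Fail", set by the tester
    comments: str = ""

# Example: one positive, real-life scenario written in plain language
script = UatScript(
    script_id="UAT-014",
    title="Submit a timesheet for approval",
    business_role="Field Engineer",
    steps=[
        TestStep("Open the timesheet form for the current week",
                 "The form shows your assigned projects"),
        TestStep("Enter hours against a project and press Submit",
                 "A confirmation appears and the status becomes 'Pending approval'"),
    ],
)
script.result = "Pass"
```

Keeping every step as an action/observation pair is what lets non-technical testers mark a script pass or fail without understanding the system's internals.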
The product champions who volunteered to test the system had to allocate time to this exercise, learn how to follow the test scripts, mark them as pass/fail and raise issues they noticed during testing. Because the volunteers' time was limited, we prioritised the most frequent, highest-risk scenarios for the UAT plan.
We then ran training sessions for the volunteers so they could learn the Azure DevOps test management tool and the sandbox environment. We also conducted training and demonstration sessions to explain the new paperless process to the participants and showcased the software solution to be tested. This ensured they had the right skills to complete the tasks required for UAT.
Due to the restrictions introduced in Melbourne to fight the spread of COVID-19, we had to facilitate the UAT sessions remotely. Each day, we held a 30-minute call in the morning and another in the afternoon with all of the UAT testers to explain the scope of testing for that day, share updates and answer questions. We established a support channel for all UAT-related items on Microsoft Teams and assigned shifts among ourselves to assist the business users with any UAT-related issues. The team also created a UAT dashboard in Azure DevOps with progress indicators and a view of the matters raised. These measures kept everyone connected during the UAT phase, and business users felt supported and engaged throughout the process.
The feedback received during UAT varied: genuine defects (bugs), proposed improvements (change requests), and questions about the new business process and system technicalities. Each day the feedback was triaged and then raised with the product owner and technical team. Bugs were prioritised based on their impact versus the cost of fixing, and either assigned for fixing before the release or moved to the backlog for future releases. Change requests were likewise prioritised based on user benefit, frequency of use, the risk of not implementing them, and the effort and risk involved in building them; they were either assigned for development or moved to the product backlog for future consideration. We addressed questions on the business process or how the new system worked individually, sharing links to the relevant user documentation or training videos, and discussed frequently asked questions in the daily UAT calls. Once issues were resolved, they were assigned back to the reporter for retesting to confirm the fix worked as expected.
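The triage logic above can be sketched as a simple value-versus-cost model. This is not the project's actual process, just an illustrative assumption: each item gets an impact and an effort rating, and a score decides whether it is fixed before release or moved to the backlog. The 1–5 scales, the threshold and the item names are all hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FeedbackItem:
    title: str
    kind: str      # "bug" or "change_request"
    impact: int    # 1 (low) .. 5 (high) business impact if unaddressed
    effort: int    # 1 (low) .. 5 (high) cost/risk of fixing or building

def priority_score(item: FeedbackItem) -> float:
    # Value-vs-cost ratio: high impact with low effort scores highest.
    return item.impact / item.effort

def triage(items: List[FeedbackItem],
           threshold: float = 1.0) -> Tuple[List[FeedbackItem], List[FeedbackItem]]:
    """Split feedback into 'fix before release' and 'backlog' buckets."""
    before_release, backlog = [], []
    for item in sorted(items, key=priority_score, reverse=True):
        (before_release if priority_score(item) >= threshold else backlog).append(item)
    return before_release, backlog

items = [
    FeedbackItem("Approval email not sent", "bug", impact=5, effort=2),
    FeedbackItem("Add a dark theme", "change_request", impact=2, effort=4),
]
before_release, backlog = triage(items)
```

In practice the scoring would be a conversation with the product owner rather than a formula, but writing the criteria down this way keeps the daily triage consistent across the team.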
Overall, the UAT was a triumph. We captured comprehensive and meaningful feedback from the business users and identified the most critical areas of improvement that would maximise value before the release. Which of these challenges have you faced? And how did you address them?