
Usability Testing
MA Probate Court (No-Fault Divorce)


UX Researcher

(Team of 5)




Microsoft Office



Heuristic Evaluation

Usability Testing

Data Analysis



4 months

(Jan 2022 - May 2022)


The study was a formal evaluation of a subset of the divorce filing process for the Commonwealth of Massachusetts, completed on behalf of the Massachusetts Probate and Family Court Department. It focused on evaluating the usability of the No-Fault 1B Divorce Form and the process of navigating to it from the webpage outlining the No-Fault divorce process.


The client had received feedback from users and noticed a high error rate. Additionally, users often had questions about completing the form and had trouble finding the information necessary to complete it.


The primary goals for this project included the following:

  • Determine problems users face when identifying the necessary forms for the process.

  • Identify which questions on the form users have difficulty completing.

  • Understand why users have difficulty completing those questions.

  • Understand how users currently try to overcome problems in completing forms.

  • Make recommendations for improving the process, website, and forms.


We conducted an expert review and a usability test of the form and the MA Court webpage outlining the divorce process to determine specific issues and make actionable recommendations. The expert review and usability testing objectives were determined based on discussions with the client sponsor.


The research identified weaknesses and suggested improvements both to the form itself and to the surrounding filing process. The details, enumerated in the report and the presentation, are being used to convey pain points to practitioners and other administrators in the court and to inform decisions on specific aspects, such as the requirement of the Military Affidavit. As a consequence of the research, the MA Probate Court is likely to form a committee to review the case and fine-tune solutions based on the findings and recommendations, solutions that should have positive ramifications for the people using the forms.


To support the project goals, we framed the following research questions:

  1. What is the success rate for identifying the correct forms?

  2. What is the error rate for completing the Complaint for Divorce form?

    • Which questions get the most errors?

    • What terms are the least understood?

  3. What is the user’s confidence that they completed the form correctly?

  4. What is the user’s satisfaction with completing the form?

    • What are the drivers of satisfaction and dissatisfaction?

  5. What is the user’s satisfaction with locating the correct forms for the divorce process?

    • What are the drivers of satisfaction and dissatisfaction?

All testing goals were chosen based on considerations of the client’s key business goals for the initiative:

  • Reduce user anxiety during the process

  • Reduce time waiting in line

  • Reduce errors during the divorce process

  • Reduce the time to completion of the divorce process

  • Reduce the number of questions related to the form completion/filing

  • Reduce time spent searching the website for forms

Heuristic Evaluation

The goal of the expert review was a heuristic evaluation of the No-Fault 1B Divorce Form and the process of navigating to it from the 1B Divorce page on the MA Court website. The focus of this review was determined by the tasks identified for the usability testing. Jakob Nielsen’s 10 Usability Heuristics for User Interface Design was used to review the web pages leading to the 1B Divorce form, while Adobe’s Best Practices for Form Design by Nick Babich was used to evaluate the form itself against accepted form design principles.

The Dumas & Redish severity scale was used for both sets of heuristics. Each researcher first conducted an expert review individually; the findings were then combined into a group expert review.

The findings were organized into positive and negative findings. The negative findings were prioritized by severity rating, with recommendations included under each one.

These findings gave insight into the areas of the form and the webpage that needed improvement and established a baseline for the usability test.
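The severity-based prioritization described above can be sketched in a few lines. The findings and ratings below are illustrative placeholders, not the study's actual data; the scale is Dumas & Redish's, where a lower level means a more serious problem.

```python
# Illustrative findings with hypothetical severity ratings on the
# Dumas & Redish scale (Level 1 = prevents task completion,
# Level 4 = subtle problem). Not the study's actual findings.
findings = [
    {"issue": "Legal jargon left unexplained", "severity": 2},
    {"issue": "Date field accepts free-form text", "severity": 1},
    {"issue": "Inconsistent link labels", "severity": 3},
]

# A lower level means a more serious problem, so sort ascending.
prioritized = sorted(findings, key=lambda f: f["severity"])
for f in prioritized:
    print(f"Level {f['severity']}: {f['issue']}")
```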

Participant Screening and Outreach

The team prepared a screener to identify and recruit desired participants. Social networks, as well as professional networks such as LinkedIn, were used as channels to reach out to the target groups. The screener consisted of an online survey built using Qualtrics.

Users whose responses aligned with the study criteria were invited to participate in the usability test. They were contacted via email, and prior consent was obtained by asking each participant to sign a consent form. The team recruited a total of 11 participants: 1 pilot and 10 test users. The participants consisted of 5 males and 5 females spanning the age groups 25-34, 35-44, 45-55, and 55+. All participants were highly educated and proficient in English. Only 1 of the 11 participants had previously been divorced.

Task List and Moderator's Guide

Preparing the task list was a critical phase of the project and went through a few iterations. We had to ensure that the user's journey through the webpage and the form addressed all of the project goals, so the tasks were chronological in nature. We also constructed a mock scenario that would place users in a pre-defined setup and help them move through the tasks.

The moderator’s guide was written as a script to help us run the participant sessions consistently. It included an introduction to the study, briefing information for the participants, background interview questions, session protocols, the mock scenario and corresponding tasks, and the post-test debriefing.

Usability Testing

Including the pilot, a total of 11 usability sessions were conducted. Each session was led by a moderator from their own computer while the participant used theirs; testing was conducted over Zoom and recorded.

Thinking aloud was demonstrated to participants during the briefing, and they were encouraged to do so as they moved through the webpage tasks. However, while completing the Complaint for Divorce form, participants were asked not to think aloud or ask questions, and to fill out the form as they would in a real-world setting. They were free to use any other resources they would normally rely on when facing the task in real life. This was done to preserve the authenticity of user activity and data. User behavior, time on task, and issues or highlights were thoroughly noted by the observers throughout the sessions. Task order was not counterbalanced, in order to avoid exposure/learning bias.

Upon completion of all the tasks, the participants were asked some questions during post-test debriefing to gather feedback. We used a Qualtrics survey to collect quantitative data on satisfaction and confidence.

Data Analysis

We synthesized the observations noted during the sessions and pulled the gathered quantitative and qualitative data to put together our data set. We analyzed this data to derive insights and developed recommendations based on our findings. These were categorized based on the webpage containing the form and filling the form itself.


"The whole thing confused me"

"It is a lot to take in"

The broad themes that emerged from the findings were inconsistent labeling and unclear copy, legal jargon, lack of help/context, and confusing field controls. The top recommendations included:

  • Communicating with users through clear, jargon-free copy that they can digest and act on.

  • Readily providing help with legal terms and their implications.

  • Providing clear sections on the webpage listing the required forms for each condition.

  • Providing appropriate form controls, especially for date inputs, along with validation checks for input fields.

  • Providing visual feedback, such as an indicator that notifies the user when the form is being saved, and the ability to edit form contents after saving.

The usability test report was organized by which expert-review findings were validated and which findings were new. Each recommendation's impact was based on the severity of its corresponding finding, and the recommendations were prioritized accordingly. Higher priority was given to issues and recommendations related to filling out the form, since those directly contributed to complete task failure and/or severe user frustration.


To emphasize the importance of the findings and recommendations, we decided to run a benchmark comparison on the two metrics most important to our case: (1) the task completion rate for finding the forms on the website, and (2) the task completion rate for filling out the form correctly. The results were included in our presentation.

We chose a benchmark value of 78%, based on Jeff Sauro's analysis of nearly 1,200 usability tasks. For the task of finding the forms on the website, only finding all of the forms was counted as a success, since any missing form would force a second attempt and prevent the filing from being completed correctly. Likewise, for the task of filling out the form, only completing all of the details correctly was counted as a success.

The benchmark comparison gave a clear, data-backed indication of the current design's performance. This helped us better understand the implications of the quantitative data and provided an additional perspective on the findings and recommendations.
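The strict, all-or-nothing scoring and the benchmark comparison can be sketched as follows. The pass/fail results are hypothetical placeholders (the study's actual data is not reproduced here); only the 78% benchmark figure comes from the text.

```python
# Hypothetical pass/fail results per participant (1 = success under
# strict all-or-nothing scoring, 0 = failure). Not the study's data.
find_forms = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
fill_form = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]

BENCHMARK = 0.78  # average completion rate across ~1,200 tasks (Sauro)

def completion_rate(results):
    """Fraction of participants who fully completed the task."""
    return sum(results) / len(results)

for name, results in (("finding the forms", find_forms),
                      ("filling the form", fill_form)):
    rate = completion_rate(results)
    verdict = "meets" if rate >= BENCHMARK else "falls below"
    print(f"{name}: {rate:.0%} {verdict} the {BENCHMARK:.0%} benchmark")
```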

Client Presentation

As end-of-the-study deliverables, we compiled a report of our findings and prepared a presentation to communicate our findings and recommendations to the MA Court staff.


Limitations

  • There are demographic segments that were underrepresented or not captured in this study, namely people who are not fluent in English and people who do not have a post-secondary education.

  • Additionally, it proved challenging to find participants who had been through a divorce at least once before; only 1 of our 10 participants fulfilled this criterion. The primary limitation, thus, was recruiting a large number of participants, especially those belonging to a specific demographic.

Next Steps

  • Since some demographics were underrepresented or not captured, we recommend additional usability testing or a qualitative study with the Help Center to capture the problems encountered by these groups.

  • Additionally, conducting a SUS usability survey or examining data in site analytics tools may reveal additional information about the site.


  • It can be difficult to describe a task without using the words present in the interface labels. However, care has to be taken to do so, in order to avoid leading users or distorting the meaning of the task.

  • Non-significant results make it hard to make confident recommendations.

  • Requiring a response to each survey question is essential to ensure the integrity of the results.
