Rapid Research and Reporting:
Weekly Unmoderated/Moderated User Testing
My role:
User Researcher
Operations Lead
Challenge
Prior user research showed that many users found CDC.gov sites hard to read, scan, and navigate, causing confusion as they searched for information. These frustrations were exacerbated by the COVID-19 pandemic.
In light of these findings, CDC.gov contracted with my organization and team to rewrite all of CDC.gov as part of a site-wide modernization effort. This included standardizing the design of its sites using a new webpage editor, rewriting content to improve users' understanding of each page's main message, improving the readability and scannability of each page, and integrating plain language across the sites.
To meet the contract objectives of completing all studies by March 2024 and delivering testing reports quickly, my team needed a rapid testing and reporting strategy that allowed us to test and deliver reports for multiple studies each week.
Key Methods:
Survey Instrumentation and Development
User Interviews
Progress Tracking in Jira
Qualitative / Quantitative Analysis
Report Delivery
Solution
The UXR Lead and I developed a rapid testing strategy to run an average of 2-5 studies per week, with reports and findings delivered to CDC within the following 1-2 weeks. The process included:
determining which webpages to include in each study (each study consisted of 2-3 webpages to test),
developing the weekly testing and reporting schedule,
developing a kanban board to track the development and staging of tests,
collaborating with 2 research assistants to build and stage tests in Microsoft Forms and Qualtrics,
running 10 moderated and 29 unmoderated tests, and
analyzing data and presenting findings and recommendations to executive leadership.
Process
Planning and Scheduling
This image shows the rapid testing schedule for testing with the general public. (Weekly testing also occurred with Public Health Professional and Healthcare Provider audiences.)
Moderated tests (Low E-Health Literacy and Mobile) are in blue and white, unmoderated studies are in orange, and reports are in purple. On average, my team and I conducted up to 4 unmoderated and 2 moderated studies a week, with 6-9 participants per study.
We originally planned to begin testing in October 2023 and complete it in January 2024. However, some writers did not have full access to the Digital First Editor on CDC's network, preventing them from conducting any rewrites and delaying the start of user testing. That delay, coupled with a delay in selecting a testing platform, meant we needed to add 2-3 studies per week and adjust our schedule to end testing in February 2024.
Even with this adjusted timeline, we still successfully met the project deadline.
Jira Ticket Development - Tracking Work
Each study used the ticket template below, with 3 subtasks: build and stage the study, run the study, and write the study report. The research assistant was responsible for initially building out the test. The UXRs were responsible for reviewing and finalizing the tests, running them (with research assistants as notetakers), and working with the research assistant to analyze the data and write the reports delivered to CDC.
Running Tests and Delivering Reports
Test Setup and Weekly Testing happened in 3 parts:
Building Studies
Running Studies
Report Building
Building Studies
Each week, UXRs selected the pages to be included in that week's user testing. Once selected, pages were grouped into a study and added to the testing roster for the upcoming week. The UXR built the test questions for each study and assigned the ticket to a research assistant, who moved the questions into Microsoft Forms. UXRs then reviewed and finalized each study and staged it for testing. Studies staged for testing were included in that week's research.
Running Studies
UXRs conducted moderated studies (synchronous user testing with participants in real time). At the same time, unmoderated studies were released to participants, who had up to a week to complete them. UXRs often ran 2-3 moderated sessions a day, each lasting 30 minutes to an hour, to meet the requirement of 6-9 participants per study.
Report Building
During the same week that new studies were being built and run, a research assistant moved the prior week's test results into Excel and performed an initial analysis of the data. UXRs then fully examined the data and developed final findings and recommendations, typically within 1 week.
Results
My team and I tested 116 pages with 300+ users drawn from the general public, public health professional, and healthcare provider audiences, for a total of 39 studies over 15 weeks. My team and I also delivered 19 reports:
5 Mobile, 4 Low E-Health Literacy, and 4 General Public user reports
2 Healthcare Provider user reports
2 Public Health Professional user reports
2 summary (Final) reports with highlights, trends, and additional guidance synthesized from the General Public, Public Health Professional and Healthcare Provider studies
My team and I met the contract deliverable of testing a diverse set of quality pages (diverse in content topic, content template, and more) for CDC's three primary audiences: the general public, healthcare providers, and public health professionals.