Developing User Tests:
Testing CDC.gov Webpages in MS Forms & Qualtrics
My role:
User Researcher
Survey Developer
Challenge
Prior user research showed that many users found CDC.gov sites hard to read, scan, and navigate, causing confusion as they searched for information. CDC contracted with my organization and team to rewrite all of CDC.gov as part of a site-wide modernization effort.
My team was responsible for testing rewritten pages to determine whether General Public, Healthcare Provider, and Public Health Professional audiences:
were able to find the main message (text) on the page,
understood the main message of the page,
knew what to do if there was a call to action on the page, and
considered the page to be user-friendly.
Key Methods:
Survey Instrumentation and Development
User Interviews
Qualitative / Quantitative Analysis
Report Development and Delivery
Solution
To test messaging and content usability, I developed testing criteria and a survey in 3 parts:
Task completion and SEQ (Single Ease Question) to test the ability to find the main message on the page
Multiple-choice (single select) to test understanding of main message and call-to-action
Likert Scale questions to gather user ratings and reactions
Methods
Developing User Testing Criteria
Task Success and SEQ
During the development of the user research plan, my team and I collaborated on the creation of criteria to test messaging and content usability. In particular, CDC was interested in:
whether the newly developed "key points/summary" was easy to find on the page and
whether participants understood the main message/call-to-action.
My team and I determined the following measures were appropriate for testing whether participants could find the main message:
Task completion - Participants were asked to find the main message (key points/summary box) on the page. Success was rated pass/fail.
SEQ - Participants rated how easy it was to find the main message on a scale of 1 to 7, with 1 being very difficult and 7 being very easy. Scores greater than 5.3 were considered a success.
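As a minimal sketch of how these two criteria could be scored (assuming the 5.3 threshold applies to a page's average SEQ rating), the Python below uses hypothetical variable names and sample values; it is not the project's actual analysis script.

seq_ratings = [7, 6, 5, 7, 4, 6]                      # 1 = very difficult, 7 = very easy
task_results = [True, True, False, True, True, True]  # pass/fail per participant

# Share of participants who completed the "find the main message" task
pass_rate = sum(task_results) / len(task_results)

# Average SEQ rating for the page, compared against the 5.3 success threshold
mean_seq = sum(seq_ratings) / len(seq_ratings)

print(f"Task completion rate: {pass_rate:.0%}")
print(f"Mean SEQ: {mean_seq:.2f} -> {'success' if mean_seq > 5.3 else 'below threshold'}")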
Multiple-Choice, Single-Select Questions
To test whether participants understood the main message and call-to-action, we included 2 multiple-choice, single-select questions:
MCQ1: What is the main message this page is trying to convey? (understand the main message)
MCQ2: What is this page asking you to do? (understand call-to-action)
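The sketch below shows one way the share of participants who chose the intended answer for each question could be tallied; percentages like these presumably feed the "understood the main message" and "knew what action to take" results reported later. The option labels and responses are hypothetical.

intended = {"MCQ1": "B", "MCQ2": "C"}   # hypothetical "correct" option per question
responses = {
    "MCQ1": ["B", "B", "A", "B", "D", "B"],
    "MCQ2": ["C", "C", "C", "A", "C", "B"],
}

for question, answers in responses.items():
    share = answers.count(intended[question]) / len(answers)
    print(f"{question}: {share:.0%} chose the intended answer")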
Open-ended Question
During General Public unmoderated testing, we realized that because we were gathering little qualitative data, we could not explain why participants made certain selections when completing the survey. We also believed participant quotes would provide valuable context to the unmoderated survey data. So, for the Public Health Professional and Healthcare Provider unmoderated studies, we added an open-ended question: “What else would you like to tell us about this page?”
Adding this question allowed us to gather participant quotes and other data points that added context and clarity to participants' selections and thoughts.
Likert Scale Matrix
To test content usability (readability/scannability, length, and audience appropriateness), we asked five questions on a 5-point Likert scale. Average scores greater than 3.5 were considered successful (see the Public Health Professional example Likert scale below).
Surveys and Templates
I developed survey templates in Microsoft Forms and Qualtrics. Creating templates allowed my team to easily create several unique surveys each week.
Each survey contained 2-3 webpages for participants to test. Unmoderated studies contained task scenario questions, and the Public Health Professional (PHP) and Healthcare Provider (HCP) surveys contained an additional open-ended question. Surveys were developed as follows:
Microsoft Forms - All General Public surveys were developed in Microsoft Forms.
Qualtrics - Public Health Professional and Healthcare Provider studies, all of which were unmoderated, were conducted in Qualtrics.
Data Analysis
My team and I analyzed the quantitative data in Microsoft Forms and the qualitative data in Mural. To score the Likert scale questions, we converted ratings to a point scale and, for each question, calculated the average score across all respondents. Average scores higher than 3.5 were considered successful.
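A rough sketch of this scoring step is shown below, assuming a standard 1-to-5 agreement scale; the question text, response labels, and data are hypothetical and not the actual survey items.

# Map Likert response labels to points, average each question across
# respondents, and flag averages above 3.5 as successful.
POINTS = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly agree": 5}

responses = {
    "The page was easy to scan": ["Agree", "Strongly agree", "Neutral", "Agree"],
    "The page was an appropriate length": ["Disagree", "Neutral", "Agree", "Agree"],
}

for question, labels in responses.items():
    scores = [POINTS[label] for label in labels]
    average = sum(scores) / len(scores)
    print(f"{question}: {average:.2f} -> {'success' if average > 3.5 else 'not successful'}")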
Results
We developed 29 surveys based on the testing criteria defined above:
12 surveys were developed in Qualtrics, for Healthcare Provider and Public Health Professional audiences
17 surveys were developed in Microsoft Forms, for Low E-Health Literacy, Mobile, and General Public audiences
We met our contract deliverable of testing a set of quality pages, diverse in content topic, content template, and other characteristics, for CDC's three primary audiences (general public, health care providers, and public health professionals).
General Public Results
My team and I tested 80 pages with 223 general public website users.
85% of the general public users understood the main message.
89% of the general public users knew what action to take.
Overall, 59 general public pages (74%) were recommended to be published as part of the final site.
Health Care Providers (HCP) Results
ICF-RTI tested 23 pages with 72 health care provider website users.
93% of the health care provider users understood the main message.
87% of the health care provider users knew what action to take.
Overall, 17 health care provider pages (74%) were recommended to be published as part of the final site.
Public Health Professional (PHP) Results
ICF-RTI tested 12 pages with 36 public health professional website users.
61% of the public health professional users understood the main message.
78% of the public health professional users knew what action to take.
Overall, 3 public health professional pages (25%) were recommended to be published as part of the final site.