Redesign of
Bloomberg Government’s
Digital Search Directory

Usability Testing & Surveys | 6 Weeks

Customers were dissatisfied with the search directory’s features and functionality, saying the product felt “clunkier” and less modern than direct competitors’ offerings.

I led a 6-week usability study to evaluate mobile and desktop designs developed to modernize the product; insights from the study informed the final design specs.

My role
UX Researcher

Key Methods
Usability testing
Survey

Timeline
6 weeks

Tools
Great Question
Qualtrics
Teams
Miro
Figma

I led all aspects of the project, from planning and recruitment to delivering final recommendations. I also collaborated with the product team to develop and implement a post-launch feedback survey.

Key responsibilities included:

  • Recruitment & Research Ops

  • Research Design

  • Artifact Development (moderator guide, user flows)

  • Moderating & Facilitating Sessions

  • Analysis & Synthesis

  • Survey Development & Analysis

  • Findings & Recommendations

The Problem

Sales reported that customers were experiencing frustration when using the search directory to find data on Members of Congress and other government officials.

Informed by earlier research, our team had clear ideas on how to address these pain points, and wanted to test updated mobile and desktop designs with customers to evaluate usability and gather feedback on the new designs.

We also wanted to gather user reactions to the inclusion of census data, something Sales believed would increase customer satisfaction.

The Approach

Through usability testing of desktop and mobile prototypes, we set out to:

  • Determine whether users can navigate between main and detail pages/screens.

  • Understand whether users find the search results intuitive.

  • Gather user reactions to census data added to key pages within the product.

  • Determine whether users find the design user-friendly and modern, especially on mobile.

Why Usability Testing?

  • Usability testing would allow me to answer the team’s tactical questions.

  • The prototypes included previously tested designs, but also new elements that needed to be evaluated.

  • Usability testing would allow the team to watch users interact directly with the designs.

Methods & Timeline

Task-based Usability Testing

Participants were given a task and asked to complete it using the prototype.

(e.g., “You want to know the name of a Member of Congress’s Chief of Staff. Using this prototype, show me how you would find that information.”)

Mobile or Desktop First

Due to time constraints, the team decided (after much discussion) to show both the mobile and desktop prototypes in each session.

Participants were randomly assigned to see the desktop or mobile version first, with the order counterbalanced across sessions.

Census Data and District Map

There was an open question about whether participants preferred to see census data at the top of the page/screen.

Participants were randomly shown the census data/district map in different locations, and I gathered feedback on which placement they preferred.

Image: squares detailing the project timeline, from planning to delivering recommendations.

To encourage real-time collaboration, I create a whiteboard for every project and use it for project kickoffs, planning, synthesis/analysis, and presenting findings. The timeline image comes from the project’s Miro board.

Summary of Findings & Recommendations (with detailed example)


Post-launch User Feedback

After launch, I developed an in-product survey via Qualtrics to gather user feedback. I also observed and led Sales and client calls, and partnered with Product Analytics to understand user behavior and usage metrics.

Feedback Sources

  • Client emails

  • In-product survey

  • Usage analysis

  • Client interviews

Interviews: 12

Surveys: 103

Post-launch user feedback indicated some dissatisfaction with changes that had not been included in usability testing, namely hiding and removing filters to address information-density concerns. I translated the qualitative feedback into quantitative data points to identify opportunity areas.

I partnered with the Product Manager and Product Designer to analyze this feedback and determine solutions based on the product roadmap and developer capacity.

We decided to update the design to show filters on page load. We also reintroduced the “staffers from every office” and “education” filters, features used by previously identified power users.

Takeaways

More metrics.
Reflecting on the project, I think the study would have benefited from clearer KPIs, success metrics, or analytics to determine whether the new designs improved usage and/or customer satisfaction.

Potential strategy #1: Track usage metrics (DAU/MAU) and bounce rates post-release.

Potential strategy #2: Partner with Sales and Product to examine sales/contract losses and determine an appropriate retention/contract KPI (e.g., x% of users retained post-launch).