Cengage Learning:
Gale Support Site
UX RESEARCH
Enhancing accessibility for the target users of one of the world's largest education technology companies.

Timeline
January - April 2024
My Role
UX Research Consultant
The Team
Aurelia H., Aditi K., Jiani H., Lexi J., Ahn D.
PROBLEM
Teachers were not utilizing the site due to inefficient and frustrating navigation.
There are two primary user bases: librarians and teachers. While the nature of librarians' roles allows them more time to explore the site and access desired information, teachers have limited time and cannot dedicate significant effort to locating materials.
PROCESS
1. Understanding Cengage
- Interaction Map
- Stakeholder Interview
- Comparative Evaluation
2. Learning From Users
- 6 User Interviews
- Survey
3. Analyzing the Site
- Heuristic Evaluation
- 6 Usability Tests
4. Providing Solutions
- Final report and presentation for Cengage
- Summary of insights and solutions
Understanding Cengage
INTERACTION MAP
We mapped out the primary screens within Cengage's Gale Support Site via FigJam, considering expected popular user flows and essential features specific to our target users of interest: teachers.
STAKEHOLDER INTERVIEW
Interviewing two members of the support site team (the site coordinator and the developer), we identified three key objectives:
1. Streamline the navigation process
2. Provide an organizational strategy for materials
3. Identify additional unmet needs and frustrations
COMPARATIVE EVALUATION

We evaluated the key features of popular comparable sites, analyzing the organizational and navigation strategies other organizations use, and presented our findings to stakeholders.
Understanding Users
USER INTERVIEWS
83% of users initially found navigation challenging, while the rest reported that it became easier with time and familiarity.
We conducted six interviews with current users, aligning with the key goals derived from our research of Cengage and stakeholders. This involved creating interview protocols, recording, and transcribing sessions. By identifying commonalities in participants' experiences with the Support Site, we generated an affinity map.
INSIGHTS

From these interviews, we identified three main pain points that aligned with our stakeholders' hypothesis. Additionally, we pinpointed three crucial behaviors that shed light on both commonly and infrequently used aspects of the site.
SURVEY DESIGN
Due to time constraints, our team was unable to conduct the full survey. Instead, we distributed a pilot survey to three users and incorporated their feedback to streamline our questions, ensuring clearer results and reducing confusion.
The survey collects qualitative and quantitative data across three main sections: Background, Improvements and Suggestions, and Specific Experiences with the Gale Support Site.

Analyzing the Site
HEURISTIC EVALUATION
Our heuristic evaluation uncovered violations of 7 heuristics.
Each team member conducted an independent heuristic evaluation of the Support Site. We aggregated the results to identify both the highest-risk violations and the most frequently occurring ones. For each issue below, we note the pain point and the heuristic(s) violated.
1
Inconsistent page titles/navigation elements
Heuristics violated: Recognition Rather than Recall; Consistency and Standards
2
Absence of search bar features & back arrows
Heuristic violated: User Control and Freedom
3
Selectable links in sub-page headings & copyable text lack clear affordance
Heuristic violated: Recognition Rather than Recall
4
Selection indication varies by input method
Heuristics violated: Visibility of System Status; Accessibility
5
Overwhelming hover box content, unclear icon labels, small text, unclear imagery
Heuristic violated: Aesthetic & Minimalist Design
USABILITY TEST
The overall success rate for completing all three tasks among participants was approximately 53%.
We designed three tasks and a short survey for our 5 participants, each task chosen to assess different sections and interactions on the support site based on our previous research. We conducted a pilot test before the 5 usability tests, where changes were made to streamline the protocol.

We consolidated our findings and identified pain points users encountered across each task. From our results, we pinpointed four main areas of confusion for users:
1
Users prefer several methods of searching for information
2
Users were confused by the current labeling
3
Participants experienced information overload
4
Important navigation items and shortcuts are not prominently displayed
In addition to the three tasks, each of the five usability tests ended with participants completing a short survey. Below are several important findings:
60% of users rated the site as 3 or above on a complexity scale of 1 to 5.
40% of users felt confident using the site, while the remaining 60% felt unsure.
SOLUTIONS
Drawing from our comprehensive research, we categorized our findings into three key areas and developed specific, actionable recommendations to enhance site usability.
FINDINGS
RECOMMENDATIONS
1
Current navigation is unclear to users
- Streamline title and label consistency throughout the site.
- Add a navigation path and back-arrow icons to help users understand their position.
- Increase visual separation to differentiate selectable headings from static ones.
2
Overwhelming amount of uncategorized, unorganized information
- Categorize content into clearly defined, broad sections based on specific user tasks.
- Implement collapsible sections for information-dense categories.
- Incorporate illustrative images when possible to reduce text.
3
Users seek more efficient and varied methods to search for information
- Add a search bar to information-dense pages, reducing the need for scrolling.
- Align navigation elements (filtering options) with standard usability practices.
- Categorize content into overarching collapsible sections.
FINAL HANDOFF
In the end, our work was well-received by the Cengage Learning teams, and we successfully handed it off for use in future development.
LESSONS LEARNED
Things I would do differently next time:
Recruit a larger sample of short-term users.
Our participants in both the interviews and usability testing were primarily long-time users of Cengage. While we identified pain points common to both long-term users and the few short-term users we reached, further research could provide deeper insight into the experiences of new users.
Balance business goals with user needs in the beginning.
Initially, our recommendations prioritized user needs without considering the complexity of implementation. Through ongoing weekly meetings, however, we learned to balance enhancing usability with the development team's time and resource constraints.