Methods: Interviews, Usability testing, Surveys
Tools: Microsoft Teams, Qualtrics, ChatGPT
Role: UX Researcher and Designer
Timeline: Jan – May 2023
Over the years, multiple online courses have been added to the website's course catalog, making the overall look and functionality cumbersome. What started as a few tiles displaying the available courses has grown into a long list of tiles for users to sift through.
This project aims to explore more user-friendly ways to present the course catalog contents. Specifically, the focus is on enhancing the display of both the full course catalog and individual course tiles to improve user experience. The images below represent the study area of focus.
Individual Course Tile
The only recorded data on the usability of the course catalog consisted of help tickets submitted to the site's technical support section. Although these tickets highlighted challenges, it remained unclear whether they reflected the experiences of the majority of users.
A survey was created to assess the current usability of the course catalog section of the site. It was sent out to 21,190 users, and 1,005 people responded.
The Baseline Survey focused on the following questions:
development?
colleagues?
Participants responded on a 1-to-5 Likert scale (5 = Always, 4 = Usually, 3 = About half of the time, 2 = Rarely, 1 = Never). Additionally, participants were encouraged to leave comments about their course catalog experiences.
In the participant feedback, all questions received consistently high scores. Among the questions listed above, five had a mode score of 5, indicating "Always," which is exceptional. This suggests that the course catalog excelled in many aspects. However, one question, which assessed the ease of navigation on the site, received a mode score of 4, indicating "Usually." While still a commendable score, it pointed to an area for potential improvement in navigation.
The table below summarizes the survey outcomes for the quantitative questions. The mean, median, and mode are consistently high, with minimal standard deviation.
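For reference, the statistics in the table can be reproduced with Python's standard library. The responses below are hypothetical stand-ins for a single question's 1-to-5 ratings; the actual analysis used the full Qualtrics export.

```python
# Minimal sketch of the descriptive statistics reported in the table.
# The responses list is illustrative only, not real survey data.
from statistics import mean, median, mode, stdev

# 1-5 Likert responses for one question (1 = Never ... 5 = Always)
responses = [5, 5, 4, 5, 3, 5, 4, 5, 5, 4]

print(f"mean:   {mean(responses):.2f}")
print(f"median: {median(responses):.1f}")
print(f"mode:   {mode(responses)}")
print(f"std:    {stdev(responses):.2f}")
```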
In addition to the usability questions, a Net Promoter Score (NPS) question was included in the survey. The results revealed that 59% of respondents rated the course catalog a 9 or 10, classifying them as promoters. This statistic is noteworthy: it indicates not only positive sentiment toward the course catalog but also active advocacy for it among these individuals. Their high ratings signal loyalty and enthusiasm, suggesting a strong endorsement of the course catalog to others.
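For context, NPS is computed by subtracting the percentage of detractors (ratings 0 to 6) from the percentage of promoters (ratings 9 or 10). A small sketch with hypothetical ratings:

```python
# Sketch of how a Net Promoter Score is derived from 0-10 ratings.
# The ratings list is hypothetical; the survey reported 59% promoters.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

ratings = [10, 9, 9, 8, 7, 10, 9, 6, 10, 9]
print(f"NPS: {net_promoter_score(ratings):.0f}")
```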
Participants were encouraged to provide both positive feedback and constructive criticism to identify specific pain points and potential enhancements for the site. I utilized an affinity diagram to categorize the comments, revealing key areas of concern, including technical issues, complaints, suggestions for course content, and navigation challenges.
Noteworthy navigation-related issues included difficulty guiding others through the site, perceived crowding, inadequate communication regarding navigation, challenges with progress reports, and spending excessive time searching for needed information.
Reflecting on these insights, while the course catalog received commendations, there were evident opportunities to delve deeper into addressing navigation issues and enhancing the overall user experience of the site.
While the participants provided high ratings for the site, occasional help ticket complaints along with some of the comments collected in the Baseline Survey indicated areas where usability enhancements could be beneficial. Recognizing the continuous need for improvement, I initiated efforts to identify potential usability enhancements. To gather insights, I reached out to current users and recruited them for interviews.
Six participants were interviewed about the course catalog, focusing on both its visual and functional aspects.
Each interview session focused on:
Each interviewee was observed for:
you need to take next for your position"
The participants expressed positive sentiments regarding the overall look, feel, and usability of the course catalog. Their pleasant expressions and comments highlighted the ease of use of this section within the site.
However, despite the positive sentiments, interviews revealed a significant discrepancy between what users reported and what they actually experienced. While participants claimed to have an easy time navigating the course catalog to find the courses they needed, they encountered challenges when asked to locate specific site components. This led to visible signs of struggle, with participants eventually completing the tasks only after several attempts and trial and error.
In both the Baseline Survey and interviews, participants provided overwhelmingly positive feedback. I combed through the qualitative data to look for patterns.
Among the six participants, four consistently used the term "old eyes" when discussing aspects of the course catalog. This phrase recurred throughout their feedback, both in discussions about the overall catalog and in specific areas. Notably, the area displaying the targeted audience on course tiles elicited multiple mentions of "old eyes."
Although users generally perceived the course catalog as high functioning, insights from the Baseline Survey and the interviews suggested that there was room for improvement in usability. By analyzing both the quantitative and qualitative findings, I explored potential improvements to the course catalog and individual course tiles through further surveys to enhance usability on the site.
A survey was distributed, focusing on both the course catalog and the individual course tiles.
The course catalog survey was sent out to 4,762 current users. 176 people responded.
The survey focused on:
Many actionable insights were derived from combining the earlier data with the results of the course catalog survey. Below are the findings, covering both the course catalog as a whole and the individual course tiles.
As noted earlier, the vast majority of users reported that the course catalog was easy to use. However, digging a bit deeper uncovered opportunities for an improved user experience.
Drawing on other course catalogs, as well as menu designs from various types of sites, the survey asked participants which enhancements they believed would be useful when locating a course.
Users were given four options: adding a search feature, the ability to sort classes, changing the display from the current tile format to a list format, or simply using different fonts and colors to make specific courses easier to locate on the tiles.
The participant responses can be seen below:
These results indicate that a search feature would be the preferred method for locating courses. Furthermore, based on these findings, I concluded that the current tile format is preferred over the list-style view offered as an alternative.
Participants were surveyed to determine their preferences regarding the information displayed on the course tile. I analyzed every component of the course tiles, aiming to understand its significance to users and how it supported their tasks. I investigated which fields users relied on to locate courses and identified potential additional fields. I assessed the visual appeal and readability of the tile, ensuring users could comprehend all information displayed. I also explored whether there were alternative designs that could enhance the appearance and effectiveness of the course catalog tile.
Participants were presented with ten different options for the type of information they would like to see on a course tile. They were instructed to select their top three preferences. The chart below illustrates the ranking of these options based on participants' selections.
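The ranking in the chart amounts to tallying how often each option appears in participants' top-three picks. A minimal sketch of that tally, using hypothetical field names and responses rather than the actual survey export:

```python
# Count how many times each tile field appears in participants' top-three picks.
from collections import Counter

selections = [
    ["Course description", "Course title", "Total hours"],
    ["Course description", "Targeted audience", "Course title"],
    ["Course title", "Course description", "Total hours"],
]

counts = Counter(field for picks in selections for field in picks)
for field, votes in counts.most_common():
    print(f"{field}: {votes}")
```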
The course description was reported as the most important element of the course tile. Its wording should adequately describe the course and be easy to read.
Survey participants selected the course title as the second most important factor listed on a tile. The current titles pose multiple issues. Several course titles use similar wording, making it challenging to distinguish one course from another. Merely arranging the course names alphabetically proved insufficient, especially since many participants were unaware of this arrangement. During interviews, users encountered difficulties in locating specific courses. Users expressed a need for additional methods to enhance accuracy and speed in finding classes.
The section of the course tile dedicated to the targeted audience presented two primary issues. First, its visual design: the black background with white lettering proved difficult to read, prompting the "old eyes" comments heard in earlier interviews. The survey reinforced this concern: while 60% of respondents found this area useful, 35% admitted to never noticing it on the course tile at all.
Additionally, the targeted audience names currently offer 23 different options, complicating matters for users. Instead of a few straightforward categories, users are now faced with numerous possibilities, making it difficult to identify the most relevant audience for each course. Further investigation revealed that there was no standardized process for determining these targeted audience names within the organization. Each team had the autonomy to create their own names and classifications without seeking consensus from other departments. Consequently, different teams may have conflicting ways of categorizing courses, leading to user confusion.
Overall, these findings indicate several areas within the targeted audience section that require improvement for enhancing the user experience. In the chart provided above, the phrase "targeted audience" is represented as "which roles the course is applicable to," which was used during participant surveys.
The current course tiles indicate the number of classes within each course, yet feedback from user interviews underscored the significance of knowing the total number of hours rather than just the number of classes. Users expressed that the number of classes felt arbitrary and did not adequately convey the time commitment required for the courses.
Subsequently, surveys were conducted to compare users' preferences between the number of classes and the total time in hours. Results revealed a clear preference for knowing the total hours, as this information was deemed essential for managing time effectively.
Utilizing the insights mentioned earlier, I developed these prototypes for the course catalog.
The addition of a search feature at the top of the tiles facilitates course discovery. Users have the option to enter their own search terms or utilize the sorting feature, as depicted in the next image.
The search feature doubles as a sorting tool, enabling users to refine their course preferences based on various criteria such as their role/job, the age of the children they teach, course duration, and credit eligibility for specific programs. These sorting options cater to diverse user needs, enhancing usability for all.
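To make the interaction concrete, the sketch below shows how a catalog search that doubles as a sorting tool might filter and order courses. The Course fields, sort keys, and sample data are illustrative assumptions, not the production implementation.

```python
# Illustrative search-and-sort behavior for the proposed course catalog.
from dataclasses import dataclass

@dataclass
class Course:
    title: str
    audience: str        # role/job the course is best for
    hours: float         # total time to complete
    credit_eligible: bool

def find_courses(catalog, query="", sort_by=None):
    # Filter by search term, then optionally sort by a user-chosen criterion.
    results = [c for c in catalog if query.lower() in c.title.lower()]
    if sort_by == "duration":
        results.sort(key=lambda c: c.hours)
    elif sort_by == "credit":
        results.sort(key=lambda c: not c.credit_eligible)
    return results

catalog = [
    Course("Classroom Basics", "New teachers", 4.5, True),
    Course("Advanced Curriculum Design", "Curriculum leads", 8.0, False),
]
print(find_courses(catalog, query="basics", sort_by="duration"))
```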
The image below showcases the course tile as it appeared during the study, alongside the proposed modifications derived from the research conducted.
The title was identified as the second most crucial element of a course tile. It will continue to be prominently displayed to aid in user identification. Whenever feasible, unique wording should be employed to assist users in differentiating between courses.
This section was recognized as the third most critical area of the course tile. Consequently, it was relocated above the course unit of measure, given its higher importance according to current users. To minimize clutter, the phrase "Best Fit For" was abbreviated to "Best For," and the font size was increased to enhance visibility. Additionally, the black background was eliminated based on user feedback indicating that it hindered readability. Ensuring clarity in this section was prioritized.
The unit of measure featured on the course tile was transitioned from indicating the number of courses to detailing the time required for course completion. Analysis revealed that users prioritized knowing the number of hours, which would assist them in managing their course workload more effectively. Moreover, this information was elevated in prominence on the tile to enhance accessibility and provide greater transparency regarding the duration of the course series, thereby improving usability.
The description now focuses solely on providing relevant information about the course. Any redundant wording, such as repeating the course title or mentioning the company name, has been eliminated. This aids the scannability of the tile.
Mockups were developed to explore potential enhancements aimed at improving the current site's usability. To progress, it is advisable to conduct usability testing before implementing any design changes. Testing and iterating on the proposed course catalog layout, including its menu with the search and sort features, are crucial steps. Additionally, confirming and further testing the usability of the proposed course tile design is essential for ensuring optimal user experience.