8 Billion Minds is a platform for live learning that facilitates knowledge exchange worldwide, completely free of charge. Users can share, teach, or learn about a wide variety of subjects in a casual and informal setting.
A beta site had been created and needed testing with real users. This is where I came in. No testing had been conducted previously, so it was necessary to validate the current design decisions.
I wanted to test the beta site with real users to measure the success rate of key user tasks. This would uncover problems, reveal opportunities, and teach us about users' behavior and preferences.
I also wanted to assess learnability for new users interacting with the 8 Billion Minds beta site for the first time on a desktop device, observing and measuring whether users understood the site's basic functions and navigation well enough to access lessons and teachers. Finally, I wanted to identify any pain points and provide recommendations to address them.
11 weeks on a part-time schedule
Sr UX Designer, PM, Founder & UX Designer (myself)
Capture user feedback on current user experience of beta site
Preparation is key to successful usability testing. My role was to plan, conduct, analyze, and present the findings.
Before beginning the testing, it was important to create a testing plan to stay organized and focused. The following usability testing process was created:
CREATE TEST PLAN
- Identify goals
- Create test script
- Recruit users
FACILITATE THE TEST
- Conduct test
- Observe & record user behavior
- Note pain points
ANALYZE & SYNTHESIZE
- Review videos and assess user performance
- Affinity map
- Provide user-centered recommendations
BREAKING DOWN THE PROCESS
Once the scope of work was defined and with the goal in mind, I defined my test objectives. Next, I wrote the test script and began recruiting participants.
Create a Test Plan
Identify Testing Goals
I set three main goals to accomplish in this usability test:
To observe how easily users could complete the onboarding and sign-up process.
To determine if participants could quickly and easily understand the purpose of the site and the value it provides.
To observe the usability of key features such as searching for a subject/teacher, requesting a session, sending a message, and finding resources.
These are the final tasks that users will need to complete during the testing session:
Complete the onboarding steps 1-3.
Create an account.
Search for a subject that you are interested in learning more about.
Search for a teacher.
Send a message to a teacher.
Request a session.
Find your resources.
The methodology I decided to use for the usability test was remote moderated testing. I used Zoom as the recording platform for all tests. The test consisted of a short briefing before the session, task performance using the beta site on a desktop device, and a debriefing session.
Five participants were recruited for the usability test through a screening process based on basic demographics to ensure that they represented the target users. The participants were sourced from a contact list of colleagues and friends.
Usability Metrics: Success Rate & Severity
Problems were rated using Jakob Nielsen's severity scale:
- 0 = I don't agree that this is a usability problem at all
- 1 = Cosmetic problem only: need not be fixed unless extra time is available on project
- 2 = Minor usability problem: fixing this should be given low priority
- 3 = Major usability problem: important to fix and should be given high priority
- 4 = Usability catastrophe: imperative to fix before product can be released
FACILITATE THE TEST
To get started, I explained the agenda for the session and put participants at ease by initiating a conversation with general questions. I also had them sign a video recording consent form.
Once they were comfortable, we began the usability test. I directed them to the beta site and explained the tasks one by one, encouraging them to think out loud.
During the test, I observed their reactions, identified pain points, and noted valuable feedback as they completed each task. I also redirected the users to the task at hand when necessary.
Conducting Remote Tests
Remote testing was held June 10–14, 2022, using Zoom. Participants received details about the session, including a video link and a consent form, via email before the testing date. Each session was allocated 20–30 minutes, which included a short briefing before the tasks and a debriefing afterwards.
In total, we observed 35 attempts to perform tasks. Of those attempts, 10 were successful, 20 were partially successful, and 5 were failures. For this particular site, we assigned half a point for partial success.
Therefore, the success rate was calculated as (10 + (20 × 0.5)) / 35 ≈ 57%.
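The partial-credit calculation above can be sketched as a small function (the names here are illustrative, not part of the study's tooling):

```typescript
// Success rate with partial credit: full successes count 1,
// partial successes count 0.5, and failures count 0.
function successRate(success: number, partial: number, failure: number): number {
  const attempts = success + partial + failure;
  return (success + partial * 0.5) / attempts;
}

// The attempts observed in this study: 10 successes, 20 partials, 5 failures.
const rate = successRate(10, 20, 5);
console.log(`${Math.round(rate * 100)}%`); // 57%
```

Counting partial successes at half weight keeps the metric from hiding tasks that users struggled through but eventually finished.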
ANALYZE & SYNTHESIZE
After reviewing the recordings of the testing sessions, I identified the pain points that each user experienced and used affinity mapping to categorize and synthesize the data.
Search for a subject: Some users questioned whether their subject was available and were confused by the suggested list, thinking they were limited to it.
Completing onboarding steps 1-3: One user wasn't sure if the example greeting was an example or something someone had typed. All completed steps 1-3, but all expressed some sort of uncertainty about the form being used.
Creating an account: All users disliked having to figure out their time zone.
Searching for a teacher: After searching and reaching the matches screen, some users questioned whether the available cards were teachers; it was not obvious. All would like to see more detail in bios and experience.
Requesting a session: Two users skipped the duration feature, and all had to think about their time zone.
Sending a message to a teacher: Most users were confused by the button labeled "message send, close here" after they submitted their messages.
Finding resources: All users spent time clicking on the "about us" icon or info button, hoping to find help or resources. None found it.
Adding their time zones: All users expressed frustration with the current experience of figuring out their time zone.
FEATURE PRIORITIZATION 2x2 MATRIX
To help prioritize the issues, we used a 2x2 matrix for feature prioritization. This allowed us to rank the categories of issues according to their importance to the business (x-axis) and to the users (y-axis). The founder, another UX designer, and I then worked together to organize each issue in its corresponding area and label them in order of importance.
CREATE TEST REPORT
I created a test report that included each of the eight pain points uncovered through testing, with a user-centered recommendation for each. I presented my findings, insights, and recommendations to the founder, the product manager, and the other UX designer. Below are the top three issues identified from the 2x2 matrix above.
One pain point was that all users found it difficult to enter their time zone during the sign-up process. Most were unsure how to use the time zone feature and expressed frustration at having to figure it out themselves. Based on their feedback, additional features were added to make the sign-up/login process more user-friendly, faster, and more secure.
Beta sign up form
Beta time zone list
Recommendation: Remove the time zone list and add a "find my location" feature to automatically set the time zone based on the user's current location. Additionally, consider using a more familiar and intuitive form for the sign-up/login process to improve the user experience.
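One way to implement a "find my location" feature without a geolocation permission prompt is to read the IANA time zone that modern browsers already expose. A minimal sketch (the form field ID is a hypothetical example):

```typescript
// Detect the user's IANA time zone from the browser's own settings.
// No geolocation permission is required for this.
function detectTimeZone(): string {
  return Intl.DateTimeFormat().resolvedOptions().timeZone; // e.g. "America/New_York"
}

// Hypothetical usage: pre-fill the sign-up form's time-zone field,
// while still letting the user override the detected value.
// (document.getElementById("timezone") as HTMLInputElement).value = detectTimeZone();
```

Pre-filling rather than hard-setting the value keeps the user in control if the browser's zone is wrong (e.g. while traveling).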
Onboarding Steps 1, 2 & 3
Although all users were able to complete all three steps of the onboarding, they expressed uncertainty about the form being used.
Starting with the introduction screen was found to be intimidating for some users, so our first solution was to change the order of the steps. The original order was:
1. Introduce yourself to the community
2. What subject do you want to learn?
3. What subjects can you teach?
This order may have been intimidating because users were asked to introduce themselves to the community before even being sure they wanted to use the platform. As a solution, we changed the order of the steps:
1. What subject do you want to learn?
2. What subjects can you teach?
3. Introduce yourself to the community
Since the goal of the site is to provide a live learning platform that fosters knowledge exchange around the world, and users felt uncomfortable introducing themselves up front, the new order above was implemented.
Beta onboarding steps 1, 2 & 3
Recommendation: Our recommendations include updating the form to a more common format, redesigning the UI to align with the brand, removing the footer from onboarding screens, and updating the action buttons.
Pain point: All users found the search box on the landing page, but some questioned whether their subject was available and were confused by the suggested list that appeared as they typed. They thought they were limited to the list.
Insight: Allow the user to be the first to create a subject category if their subject isn't found.
Insight: Give the user options to get engaged and get excited to sign up
Beta Search/Landing Page
Recommendation for Search/Landing Page
Improved accessibility by adding contrasting colors for the action buttons.
Improved typography by reducing the number of lines in the title and subtitle, changing the font, and making better use of negative space for improved readability.
Some users were not sure what to search for, so to help engage them I added an "Explore" button, which allows them to browse and discover new possibilities.
Testing and Iterating
To validate the design decisions above, my next step is to conduct another usability test to measure the success rate of the updated features and UI.