Babbel is a language learning app offering 14 language options to both B2B and B2C customers.


The Challenge
To develop a tool that empowers learners to practice effectively and gain confidence in their language skills.


My Role
I led the end-to-end Product Design process, including:

• Influencing the strategy and product vision
• Conducting UX/UI design
• Prototyping
• Performing user testing
• Facilitating developer handoff


My design process 



At this stage, the prioritized solution involved:

Developing an assessment tool that enables learners to track their progress, identify areas for improvement, and receive guidance to enhance their learning journey.



During this stage, I facilitated an ideation session with the Product Manager, Engineers, Content Designer, Learning Content Experts, and Product Analytics. The objective of the session was to achieve clarity on three key topics:

1. What aspects would we like to track?
2. What information can we potentially display?
3. What recommendations can we provide to users?



Designing an Assessment
Before starting the design, I needed to address several key questions:

• What are the different types of assessments available? Examples include placement tests, CEFR assessments, quizzes, etc.
• What exactly is a CEFR assessment?
• What approaches do our competitors take in their assessments?
• How many questions are needed to accurately determine someone's proficiency level?
• Is the Babbel Assessment considered an official certificate, similar to those from Oxford or TOEFL?

Design decisions after several iterations:
• We needed an onboarding screen to set expectations about the test. It is important for users to understand the test's duration and the skills that will be assessed.
• We had to create a different look and feel from regular lessons while still following Babbel's style. In regular lessons, users receive immediate feedback on whether they answered a question correctly; in this case, we had to change that behavior.




Additional Insights from User Testing:

Through our research, we discovered that the Results Screen is the most crucial component of the user experience. Learners have high expectations for this screen, anticipating detailed feedback and clear guidance on areas that require improvement.
One key improvement we made after the first release was to add the option for learners to check all their answers. We discovered that learners find it incredibly helpful to see both their correct answers (strengths) and mistakes (weaknesses) as part of their learning experience.




Check out the final prototype: