
Redesigning the search experience to better meet user needs

Old design (Q4 2022) ⛔️

Old search design

New design (Q4 2023) 🥳

New search design

📌 But first, some context…

What does BridgeU do? 🤔

  • BridgeU is a university guidance platform that connects 16–18 year-old students in their final years of school to universities around the world

  • The web app asks students about the countries and courses they're considering for university, as well as things like their predicted grades

  • The algorithm then begins to suggest potential uni matches based on their answers

The problem

Of the OKRs set by the business in 2023, my squad was tasked with focusing on the following:


  1. Increase the average number of unique platform sessions per 2024 grad class student from 5 to 10

  2. Increase the % of 2024 grad class students with 3+ shortlisted unis/courses from 45% to 70%


This led us to identify opportunity areas within the product where we felt we could create the biggest impact. One of these areas was the search feature: it contributed to around 30–35% of all shortlists, and we had heard feedback in the past that the experience wasn’t optimal.

Roles & responsibilities

My role: Product Design Lead

Duration: 6 months
Tools: Amplitude, Figma/FigJam, Zoom, Condens

User group

Students in the 2024 grad class who are in the process of researching unis and courses.

I worked in close collaboration with:

  1. the Product Manager to align our vision

  2. the dev team to garner feedback on the technical feasibility of different design ideas

  3. the data team to learn about data coverage

  4. the UX Researcher to establish learning objectives and to synthesise user feedback

📊 Understanding the problem space

Reviewing past research

Research data from past generative user interviews showed that students view search as a key part of the process in finding the right unis and courses.

What we also learnt from this past research is that students found BridgeU's search feature frustrating for 3 main reasons:

Students were unable to search via course/career


E.g. economics, mechanical engineering, international relations, doctor

Students were unable to search via non-country locations

E.g. London, New York, Dubai, Europe

Students were unable to narrow their results to find the most relevant results

E.g. Filters/sorts, fees, location, subject, ranking

Digging through the data

Following these insights, the Product Manager and I wanted to understand the sort of search queries students were entering. So we jumped into Amplitude (a data analysis tool) and exported the 1,000 most common search queries entered over the last year. I then began to highlight them according to their theme and level of complexity (i.e. university? course? city? university/course combo? etc.)

This helped me to get a solid idea of what it was that students were searching for and which scenarios and edge cases I'd need to account for in my designs.

Screenshot of top 1000 search queries
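As an illustrative aside, the kind of theme-tagging described above can be sketched with simple keyword rules. The theme lists, function name and example queries below are hypothetical stand-ins, not BridgeU's actual data or tooling:

```python
# Hypothetical sketch of bucketing exported search queries by theme.
# Keyword sets are illustrative examples, not real BridgeU data.
COURSE_TERMS = {"economics", "engineering", "medicine", "law"}
LOCATION_TERMS = {"london", "new york", "dubai", "europe"}
UNIVERSITY_TERMS = {"university", "college", "school"}

def categorise(query: str) -> list[str]:
    """Return the theme(s) a search query touches."""
    words = set(query.lower().split())
    themes = []
    if words & COURSE_TERMS:
        themes.append("course")
    if words & LOCATION_TERMS:
        themes.append("location")
    if words & UNIVERSITY_TERMS:
        themes.append("university")
    return themes or ["other"]

for q in ["economics in london", "harvard", "mechanical engineering"]:
    print(q, "->", categorise(q))
```

Note how a query like "London School of Economics" would match several themes at once – exactly the sort of combo query and edge case the designs needed to account for.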

Heuristic evaluation

Before thinking about solutions, I had to understand at what point the search experience fails. Which search term themes lead to dead ends?


So, using example search queries from Amplitude above, I mapped the various flows in FigJam, annotating which type of search terms led to a dead end and where we were offering a poor user experience.

Screenshot of heuristic evaluation

Competitor analysis

Unfortunately, our competitors’ products were hidden behind a paywall, making it difficult for me to experience their products first-hand. Nevertheless, I liaised with the customer success team to find out what they knew and scoured LinkedIn and our competitors’ online help centres to understand how their search experiences worked.

Not wanting to constrain myself to the EdTech space, I also used Mobbin to pore over the search flows of different companies, including Google, Wise and ASOS.

Screenshot of competitor analysis

💡 Ideation


With an understanding of user goals, a picture of how the current experience was broken, and inspiration from my competitor analysis, we felt we had a solid grasp of the problem space.


I grabbed some A4 and began throwing ideas down with my Sharpie. After sharing in the design review, the design team and I concluded that concepts 2 and 3 used different layout patterns for the filters and cards. Whilst not bad ideas in themselves, they would have introduced inconsistency into the product and a lot of tech/design debt. Concept 1, however, with its pre-screen idea, could be a very interesting way of helping students to filter and was worth further exploration.


So, I returned to FigJam for some low-fidelity exploration of this plus other concepts for desktop and mobile. Then, over Zoom I shared some of these concepts with my PM and the dev team to garner early feedback.

Here, the team helped me to understand the technical feasibility of these different concepts. Some ‘what ifs’ from the devs aided me in exploring and accounting for a wide range of edge cases before going into higher fidelity.

After fleshing out my thinking and getting feedback from the design team, the PM and I reviewed these flows and agreed to discount concept 4 from desktop due to its complexity on the backend.

Screenshot of wireframe concepts

Moving to mid-fidelity

At this point, I took concepts 1–3 to mid-fidelity, padding out my thinking and adding more context to the screens (real data, button copy etc.).


I did not focus much on mobile here, as user testing would be concentrated on desktop, given that the majority of BridgeU's users prefer to work on desktop over mobile. Plus, I had already given consideration to mobile in low-fi (see above) and would return to the mobile designs once user testing concluded.

Concept 1 – filters/sorts

Screenshot of concept 1 – filters/sorts

Concept 2 – snapshots

Screenshot of concept 2 – snapshots

Concept 3 – course grouping

Screenshot of concept 3 – course grouping

Before testing any concepts with users, I felt it would be beneficial to get feedback on my work from outside of my immediate team. Given what they know about our customers, who better than the Customer Success team?

I walked a few of the team through the 3 concepts and asked for their feedback. The team were most favourable towards concept 3 (course grouping) because:

  1. they felt this concept would be the best one at summarising many of the different courses available to students, reducing the chance of students becoming overwhelmed and reducing the time-to-value

  2. the chips in the course cards could help to provide more context for each course, aiding students in their decision making

Sharing this feedback with the Product Manager, we now felt comfortable enough to get concept 3 in front of our users and see how they really felt.

🎯 User testing

Getting prepared

Liaising with the UX Researcher, I felt that observing user behaviour and having the opportunity to ask follow-up questions would be invaluable in understanding whether or not concept 3 was a viable solution. So we opted for moderated user testing over unmoderated.


The UX Researcher and I worked together to build a testing script based on the following question:

Is having a course grouping screen helpful or unhelpful to students in narrowing course options? 🤔

One problem BridgeU has is that the filters design pattern takes up a lot of space on the page and is not particularly scalable. In an effort to solve both of these problems, I incorporated an overlay-style concept (think Airbnb) into my designs. I also asked my colleagues to do an asynchronous run-through of the prototype, ensuring there would be no dead ends or broken links during testing.

Whilst this was happening, we recruited 5 users for our testing sessions via email outreach.


Uploading the recordings to Condens (a research repository tool), I coded the recording transcripts using an inductive approach so that I could:


  1. work at speed

  2. ensure that no piece of feedback went ignored

Screenshot of transcript from user testing

After coding the transcripts, I synthesised the data into a report which I shared with my PM and devs. Here are the key findings from the testing.


With “London School of Economics” as the search query, most students expected to see both LSE AND universities in London that offer Economics ✅

Most students believed that these chips are trying to provide more context of different course options ✅

Most students believed that this course groupings screen shows an overview of all course options ✅


After clicking “search”, most students did not expect to see this course groupings concept screen 🛑

Whilst students were aligned in their interpretation of the concept 3 screen and the filters they’d want to interact with, the majority were not expecting to see it at this point in the flow.

Taking time to further explore this concept, we came to the conclusion that whilst there were certain scenarios where this concept could be very helpful, equally there were scenarios where we felt less certain about its viability.


In an effort to build at speed, we decided not to include this extra screen in the MVP without further testing. Instead, we would go with the simpler concept 1.

🚀 The final design


My concerns about the filters design pattern currently implemented in the product were down to its scalability for future use on desktop, as well as it being inadequate for mobile – something that we know from past research is the go-to device for Gen Z.

However, after conversations with the devs, we realised that adopting a new filters design pattern would have left us in serious jeopardy of releasing this work late, and thus not addressing the third user need: helping students to narrow their search results. So using the existing design pattern for both desktop and mobile was a trade-off we had to make.

Smaller changes I was able to make included iterations to the search results cards. One problem identified in previous user research was that students wanted to see information on global university rank and international fees. Another was that the inconsistent typography styles within the old search cards were confusing for users and added unnecessary cognitive load.


So I shared my iterations to the search cards with the design team and used this project to agree on and log a new set of typography guidelines within Zeroheight – an unexpected bonus!

I also tightened the amount of space afforded to the search bar, tabs and filters so we could dedicate more space on the screen for the search results cards.

Once the PM had outlined the order of work based upon business priorities, I worked with the devs to break my designs down into individual cards in Trello. Taking a suggestion from a developer, I created checklists in each card as acceptance criteria – helpful for all parties when checking dev work before production.

Screenshot of the final design

📈 Business impact

Total # of searches performed by year 13s (final year students)


% of students viewing a uni/course profile within 5 mins of searching


% of search terms entered that return 0 results


🪞 Reflections

Biggest challenge

This project posed more UX challenges than UI hurdles. Navigating through Amplitude data and expanding designs to accommodate various combinations demanded extensive thought and collaboration with stakeholders.

Most proud of

I'm delighted with the improvements to search cards and filters. We now display sought-after data (international fees and uni rankings) in search cards, as well as course cards, addressing students' needs. Additionally, we utilised existing data to create filters, facilitating students in narrowing down their choices.

A failure

Spending more time elaborating on concept 3 early on would have revealed its non-viability sooner. This would have saved time and enabled testing of a more feasible concept – an important lesson learned.

I was really surprised by

Unifrog, our main competitor, lacks a user-friendly search compared to our vision. Despite its extensive filters, it has no free-text search, doesn't display key info in its search cards, and can't sort results by universities/courses. This insight strengthens BridgeU's competitive edge for business growth.

Next steps

  1. To collect feedback on the new search, I would like to implement a pop-up survey triggered by user interaction. The survey would ask about users' success in finding information and offer the option to share their email for a follow-up conversation

  2. Regarding design, I acknowledge limitations in the scalability of the current filter pattern. I'd have been very keen to discuss with the dev team the possibility of updating all filters across BridgeU to a more scalable option, such as an overlay (Airbnb) or a side filter pattern (Amazon)

  3. To enhance value for BridgeU's partner universities, I propose exploring a 1-column approach for search cards with a sidebar to showcase our partners. This aligns with our goal of maximising company revenue through added value for partners

Liked what you saw? 👀

Check out more of my work below.



Adding new data points to university course profiles

Q1 2023
