Background

Seperia is a performance marketing company that builds and operates review sites with high-intent traffic. Reaching numerous users across the globe, these comparison sites give the bottom-line details of top brands (in industries such as fintech, management software, and more), helping individuals make informed decisions. At Seperia, we wanted to improve our mobile product comprehensively by conducting competitive analysis and grounding our designs in research and data. We also wanted to apply some of those insights to improve our desktop experience. While each of our comparison websites has its own user base, the goal of this project was to take a macro-level approach: to redesign and restructure our comparison tables in general and create a better user experience. The following redesign of the “Accounting Software” site is a simulation project of what we achieved.

My Role: Researcher, UX/UI Designer

 

Goals

  • Create a “new and improved” mobile and desktop product, following best practice for building a clean, easy-to-use structure for brand comparison.

  • Conduct competitive analysis of various comparison sites (driven by paid traffic) together with the marketing team, analyzing both site data and the design of each comparison page.

  • Identify UX/UI patterns across the seemingly best-performing competitor sites.

  • Redesign and test our new comparison tables, seeing how they perform against our previous designs.

 

Process

 

Research

 

Research Goals

  • Track and compare marketing metrics of competitor sites, using web analytics from Semrush, Similarweb, and/or Ahrefs.

  • Rank the competitor sites based on their “performance”, focusing mainly on the highest-performing sites.

  • Examine the designs of those sites’ comparison tables, collecting patterns and understanding best practices for UX/UI in this domain. 

 

Competitive Analysis

We wanted our redesigns to be grounded in data, so in collaboration with the marketing team, I conducted a competitive analysis. We examined and collected data from a variety of companies that run paid advertising for their comparison tables (in industries such as mortgage loans, website builders, antivirus software, and others). Ideally, we would assess performance data from the sites themselves, including conversion rates. Since such analytics are not publicly available, we focused on the marketing metrics that could still give us an understanding of a company’s standing, site success, and legitimacy.

Using Semrush, we examined the following marketing metrics for our competitors (illustrated in the spreadsheet below): 

  • Traffic Cost - If the monthly cost is high, we assume the company is large in scope and invests heavily in advertising to support its business goals.

  • Paid Search Traffic - If the monthly number of visitors is high, we assume this is a “big site” and a legitimate reference for design decisions.

  • Bounce Rate - If this percentage is on the low end, we assume the site keeps users engaged, and those users are more likely to convert.

 
 

The Excel spreadsheet above ranks a sample of the competitor sites by performance on the aforementioned marketing metrics. Rows marked in green have a relatively high traffic cost, high monthly paid search traffic, and low bounce rate. These particular comparison sites (along with those marked in yellow, the mid-level performers) served as the benchmark and inspiration for our site redesign.
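To make the ranking logic concrete, here is a minimal sketch of how the three Semrush metrics could be combined into a single score and bucketed into the green/yellow/red tiers. The data, column names, weighting, and thresholds are hypothetical placeholders; the actual ranking was done in the spreadsheet shown above.

```python
import pandas as pd

# Hypothetical sample of competitor metrics (not real Semrush data)
competitors = pd.DataFrame(
    {
        "site": ["competitor-a.com", "competitor-b.com", "competitor-c.com"],
        "traffic_cost_usd": [120_000, 45_000, 8_000],      # monthly paid traffic cost
        "paid_search_traffic": [300_000, 90_000, 15_000],  # monthly paid visitors
        "bounce_rate": [0.38, 0.52, 0.71],                  # lower is better
    }
)

def min_max(series, invert=False):
    """Scale a metric to 0-1; invert when a lower raw value is better."""
    scaled = (series - series.min()) / (series.max() - series.min())
    return 1 - scaled if invert else scaled

# Equal-weight average of the three normalized metrics (assumed weighting)
competitors["score"] = (
    min_max(competitors["traffic_cost_usd"])
    + min_max(competitors["paid_search_traffic"])
    + min_max(competitors["bounce_rate"], invert=True)
) / 3

# Bucket sites into the tiers used in the spreadsheet (thresholds are illustrative)
competitors["tier"] = pd.cut(
    competitors["score"],
    bins=[-0.01, 0.33, 0.66, 1.0],
    labels=["red", "yellow", "green"],
)

print(competitors.sort_values("score", ascending=False))
```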

The final column of the spreadsheet, “Mobile Table Screenshot”, documents how each site’s comparison table looks and feels on mobile. I consolidated all of the screenshots in a separate file in order to compare and analyze the visual designs more easily. Below are screenshots of the various sites’ comparison tables, categorized by the sites marked in green (best performance) and yellow (medium performance).

 

Design

 

High-fidelity Mockups

When analyzing the UX/UI designs of the mid- and high-performing comparison tables in the market (as shown in the screenshots above), several patterns emerged. These design patterns and characteristics presumably represent “best practice” for mobile comparison tables, patterns that our current comparison table design lacked:

  • The date shown in the hero section often includes “last updated” and/or an icon (e.g. a checkmark). WHY IT MATTERS: This signals that the site is updated and relevant.

  • Brand listings appear in a “card” style, on top of a background, with gaps between the listings. WHY IT MATTERS: With this layout and structure, it’s visually easier to digest the information and compare the brands.

  • The #1-ranked brand shows a banner/badge, often with an icon. WHY IT MATTERS: When designed well, this highlights the top brand, showing a standout feature or opportunity.

  • USPs / Text lines are often left-aligned (not center-aligned) inside the brand listing. WHY IT MATTERS: Left-aligned bulleted text is easier to read, and it works well with the left-aligned logos and structure.

  • The rating system (editorial score) for the ranked brands often includes stars, typically paired with a numerical score out of 5 or 10. WHY IT MATTERS: Stars, paired with a numerical score, add visual engagement, which may help draw users to the more highly rated brands and encourage them to click.

  • Buttons are commonly short in width, with white text, and placed to the right of the brand logos. WHY IT MATTERS: White text on boldly/brightly colored buttons reflects a more modern design style.

I incorporated many of these design insights to transform our mobile comparison table. The before and after designs are shown below.

We wanted to apply some of the aforementioned insights to redesign our desktop experience as well. Besides translating several of the above design concepts to desktop, I made the following improvements:

  • Fixed visual and functional glitches (e.g. table frame alignment and spacing issues)

  • Changed table structure (no column titles, and separated brand cards of information) 

  • Adjusted coloration for better contrast and accessibility 

  • Highlighted a key feature per brand, for faster browsing and clearer visual hierarchy

  • Removed visual clutter, and created an overall more mature look and feel

 

UI Kit

 

Test

 

Site Analytics

Testing our designs takes a collective effort between marketing, design, product, and everything in between. Each aspect of the marketing funnel requires a user-centric approach, and together these aspects create a comprehensive user journey.

In our testing strategy, we ran a Google campaign with ads focused on the consideration and conversion stages. During each period of testing, all users were shown the same ad. Half of the traffic was then sent to the old design, and the other half was sent to the new design. In this split test, we targeted only a UK audience. After collecting data for a few months, the new design achieved the following results compared to the old design (a short worked example of how such comparisons are calculated follows the list):

  • 2x the number of registrations (i.e. users shown the new design signed up for a brand’s service at twice the rate)

  • Higher number of qualified leads (which indicates a high potential to turn into sales) 

  • Higher conversion rate (CVR) from registration to qualified leads

  • More cost-effective by the metrics of cost per click-out and cost per registration

  • Overall more sales generated for the top-listed brand
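For readers who want to see how these figures relate to each other, the sketch below walks through registration rate, registration-to-qualified-lead CVR, cost per registration, and registration lift for the two variants. All numbers are hypothetical placeholders chosen to mirror the “2x registrations” result; they are not our actual campaign data.

```python
# Hypothetical split-test numbers for each variant (equal clicks and spend assumed)
old = {"clicks": 10_000, "spend": 15_000.0, "registrations": 200, "qualified_leads": 40}
new = {"clicks": 10_000, "spend": 15_000.0, "registrations": 400, "qualified_leads": 100}

def summarize(variant):
    reg_rate = variant["registrations"] / variant["clicks"]           # click -> registration rate
    lead_cvr = variant["qualified_leads"] / variant["registrations"]  # registration -> qualified lead CVR
    cpr = variant["spend"] / variant["registrations"]                 # cost per registration
    return reg_rate, lead_cvr, cpr

for name, variant in (("old design", old), ("new design", new)):
    reg_rate, lead_cvr, cpr = summarize(variant)
    print(f"{name}: reg rate {reg_rate:.2%}, reg->lead CVR {lead_cvr:.2%}, cost/reg ${cpr:.2f}")

# With these placeholder numbers, 400 / 200 = 2.0x registrations for the new design.
print(f"registration lift: {new['registrations'] / old['registrations']:.1f}x")
```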

 

Key Takeaways

Based on much of the data from the split test, we can conclude that the new design outperforms the old design. We can also reasonably assume that the conversions and positive results are due to the design and product differences rather than differences in marketing or campaign strategy: all users - both traffic sent to the old design and traffic sent to the new design - saw and clicked on identical ads with very similar click-through rates (CTR). Our goals moving forward include the following:

  1. Since the new comparison table page design consistently performs better, there’s little sense in splitting the traffic and sending users to the old design. We’ll proceed by sending all UK traffic to the new design.

  2. Testing the design with UK traffic proved successful, so we want to broaden the scope by having other target audiences (from other geographic locations) also experience the new design.

  3. As this project took a macro-level approach to redesigning and restructuring the comparison table, we want to test this kind of design on our other comparison websites (beyond accounting software). This would include A/B testing to see what works across different industries, target audiences, geographic locations, etc.

  4. We’ll continue to track desktop and mobile design trends, particularly in the comparison space, to further improve the table design and overall product.

 
