How to A/B Test LinkedIn Ads

INTRODUCTION


According to Smart Insights, A/B testing is the top method used by marketers to optimize conversion rates for businesses. That’s because it’s a data-driven way to enhance experiences for potential customers, which ultimately translates to results. 

In fact, large companies like Google and Amazon continuously A/B test various aspects of the user experience to find out what resonates with their audiences and what doesn’t. Then they make strategic decisions based on the data.

While your business may not be reaching millions of people, you can still implement the principles of A/B testing in your company’s marketing strategies, including your B2B LinkedIn ads. 

Following are some tips on how to A/B test LinkedIn ads to maximize conversions for your B2B business. 

HOW TO CREATE DIFFERENT VERSIONS OF ADS

If you’re new to marketing, the term A/B testing may lead you to think you need to create two entirely different ads to see which performs better. In reality, it means testing two (or more) variations of a single component of an ad. For LinkedIn ads, that could mean testing the introductory text, images or videos, headlines or call-to-action buttons.

What’s more, you also can A/B test different audiences so you can home in on what specific groups respond to. In the end, you could end up creating multiple versions of ads before determining which format drives the most conversions. While this may seem a bit overwhelming, the time you put in can result in a higher return on investment. Plus, it can help direct future campaign initiatives.

Unlike platforms like Facebook and Google, LinkedIn doesn’t have a dynamic creative ad format, which automatically generates different combinations of ads from creative elements you provide. Thankfully, LinkedIn does have a “duplicate” feature that makes manual testing fairly painless. 


Start by creating a campaign with a single ad. Then copy that ad and change the component you’d like to test. Since LinkedIn users tend to respond to visuals first, you can begin by testing different graphics or photos. Create one ad with image A and another with image B (and others with images C, D, E, etc. if needed) and keep everything else the same. A true A/B test only tests the performance of one element at a time. 

Make sure to label each ad (and each campaign) so you can easily reference them later. You also should add tracking codes to the destination URLs in your ads and enable LinkedIn’s conversion tracking feature so you can see how each ad contributes to your goal. For instance, you can set up LinkedIn conversions to track purchases, lead form submissions or downloads from your website.
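As an illustration, tracking codes are typically appended to destination URLs as UTM parameters. Here is a minimal Python sketch; the campaign and variant names (and the landing page URL) are hypothetical placeholders:

```python
from urllib.parse import urlencode

def tagged_url(base_url, campaign, variant):
    """Append UTM tracking parameters so each ad variant shows up separately in analytics."""
    params = {
        "utm_source": "linkedin",
        "utm_medium": "paid",
        "utm_campaign": campaign,  # hypothetical campaign label
        "utm_content": variant,    # distinguishes image A from image B, etc.
    }
    return f"{base_url}?{urlencode(params)}"

# Example: one tagged URL per ad variant in the same campaign
url_a = tagged_url("https://example.com/landing", "q3-demo", "image-a")
url_b = tagged_url("https://example.com/landing", "q3-demo", "image-b")
```

With a distinct `utm_content` value per variant, your web analytics can attribute visits and conversions to the exact ad that drove them.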

Once you determine the winner (the best-performing ad), you can move on to the next ad component using the top-performing image. This time you could test different copy styles to see whether your audience responds best to benefit-led copy or empathy-led copy, for example. The language you use also can help you determine which stage of the sales cycle your leads are in, which can help with retargeting efforts later on. 

Continue to test each of the remaining components of your ad until you narrow it down to the top-performing version(s). Then you can scale up your ad spend using the creative elements that are most likely to get you results. 

HOW TO DETERMINE WHICH ADS ARE EFFECTIVE

Use LinkedIn’s Campaign Manager to view performance insights for your ads. The default dashboard displays metrics such as total spend, clicks, impressions, conversions and the costs related to those metrics. You also can adjust the view to see other performance attributes if needed. 

Look at conversions, conversion rates and cost-per-conversion first, as those are the key performance indicators for your ads. In other words, those metrics are directly tied to your main objectives, like lead form submissions, sales or downloads. Meanwhile, impressions, clicks and click-through rate indicate how your ads are delivering and how your target audience(s) are responding to them. Together, this performance data can help you determine which versions of your ads are most effective.
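The metrics above are simple ratios of the raw campaign numbers. A quick sketch, with made-up figures for illustration:

```python
def ad_kpis(spend, impressions, clicks, conversions):
    """Compute click-through rate, conversion rate and cost per conversion."""
    return {
        "ctr": clicks / impressions,               # how often impressions turn into clicks
        "conversion_rate": conversions / clicks,   # how often clicks turn into conversions
        "cost_per_conversion": spend / conversions,
    }

# Hypothetical campaign numbers:
kpis = ad_kpis(spend=500.0, impressions=40000, clicks=320, conversions=16)
# CTR = 0.8%, conversion rate = 5%, cost per conversion = $31.25
```

Comparing `cost_per_conversion` across variants is usually the most direct way to pick a winner, since it ties spend to your actual objective.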

Cross-reference the results from the campaign with your web analytics and your internal records to ensure accuracy. You also can compare your data with industry benchmarks and data you’ve previously collected to help measure success. 

Use A/B testing software or an A/B significance calculator to confirm that the difference between ads is statistically significant before adopting a specific version. Sometimes one ad will be clearly more likely to improve the conversion rate than the others. Other times, all variations will have the same (or a similar) likelihood of success. 
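If you prefer to check significance yourself, a standard approach is a two-proportion z-test on the two conversion rates. Here is a minimal Python sketch; the click and conversion counts are hypothetical:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/conv_b are conversion counts; n_a/n_b are clicks (or visitors) per variant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: variant A converted 48/1000 clicks, variant B 30/1000
z, p = two_proportion_z_test(48, 1000, 30, 1000)
# A p-value below 0.05 is conventionally treated as significant
```

This is the same calculation most online A/B significance calculators perform; if the p-value is above your threshold, keep the test running rather than declaring a winner.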

In general, you should allow tests to run for at least two weeks before you analyze performance or make changes. It takes time for the platform to show your ads and collect meaningful data. 

FOR MORE INFORMATION