
3 Facebook Ad Tests: How We Increased Return on Ad Spend 245% & Improved CPL 41% & 55% (By Clicking a Single Button Each Time)

Sometimes one little button can make all the difference in the world.

Case in point: these three Facebook ad case studies. Clicking one little button can dramatically change the performance of your ads. ~ Molly Pittman

In each of the three Facebook ad tests you’ll see in this post, I’ll explain how clicking one little button can dramatically change the performance of your ads.

So, if you’d like to increase your return on ad spend (ROAS) by 245%…

Or improve your cost per lead (CPL) by 41% or 55%…

Pay attention.

You’ll see how we did it by changing just a single campaign setting inside of Facebook.

Facebook Ad Test Case Study #1: How We Improved Cost per Lead by 55% by Changing the Campaign Objective

Once upon a time, we were hosting a webinar, and we wanted to run Facebook ads to get more people to sign up for it.

This should have been a cut-and-dried operation, but there was one hitch:

Due to a technical snafu, we couldn’t place our Facebook pixel on the webinar software’s Thank-You Page…

…which meant that we weren’t able to tell Facebook to optimize for conversions.

As a result, we had to optimize for a different objective: link clicks.

Now as you can see here, a single click is all that separates these two objectives:

Selecting your Facebook marketing objective

But oh, what a difference that one click can make!

We ran this campaign using “Traffic” as our objective. Basically, this told Facebook to show our ads to the people most likely to click on our ad (but NOT necessarily the people most likely to complete the webinar registration process).

When running this campaign, here were our results:

  • Spent: $1,500
  • Generated: 76 webinar signups
  • Cost: $19.74 per lead

Now that was acceptable, so we let it continue to run. But I knew I was missing out by not being able to optimize for conversions.

About a week later, I got good news. Someone was able to solve our pixel problem, which meant that now I was able to optimize for conversions instead of link clicks!

So, I paused the Traffic ads and started a brand-new campaign that was optimized for conversions.

Keep in mind that everything else about these ads was exactly the same:

  • They had the same targeting options
  • The same pictures
  • The same copy

Everything was identical. Except for the objective.

And this wasn’t even a “seasoned” pixel. In other words, when I started this campaign the pixel was brand-new; Facebook had zero conversion data. They were starting from scratch.

Here were the results:

  • Spent: $1,300
  • Generated: 145 webinar signups
  • Cost: $8.97 per lead

CPL cut in half and then some? That’s a MASSIVE improvement. A 55% decrease in our lead cost.
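If you want to sanity-check those numbers, cost per lead is just spend divided by signups. A quick sketch of the arithmetic behind this test (figures taken from the results above):

```python
def cost_per_lead(spend, leads):
    """Cost per lead: total ad spend divided by leads generated."""
    return spend / leads

# Results from the two campaigns above
traffic_cpl = cost_per_lead(1500, 76)       # Traffic objective
conversions_cpl = cost_per_lead(1300, 145)  # Conversions objective

# Fractional improvement in CPL after switching objectives
improvement = (traffic_cpl - conversions_cpl) / traffic_cpl

print(f"Traffic CPL:     ${traffic_cpl:.2f}")      # $19.74
print(f"Conversions CPL: ${conversions_cpl:.2f}")  # $8.97
print(f"CPL improvement: {improvement:.0%}")       # 55%
```

Same math works for any pair of campaigns — just swap in your own spend and lead counts.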

Takeaways From This Test:

  1. Pay attention to every single setting in Facebook. Because one little wrong click can make a BIG difference.
  2. Don’t try to trick Facebook — just tell it what you want. If what you really want is conversions, then optimize for Conversions. If what you really want is link clicks, then optimize for Traffic.

Facebook is really good at optimizing your ads to give you what you want. You just have to tell it what that is.

Facebook Ad Test Case Study #2: How We Improved Cost per Lead by 41% by Skipping Facebook’s New Ad Delivery Option

OK, so we’ve established that it’s important to be honest and tell Facebook what it is you really want from your campaign.

But recently Facebook started adding an extra option underneath the “Optimization for Ad Delivery” section. This new option allows you to (in Facebook’s words):

“Optimize for link clicks until there is enough data to optimize for conversions.”

Here’s what it looks like:

Selecting "optimize for link clicks" under Optimization & Delivery in the Facebook platform

In other words, Facebook will FIRST optimize for link clicks and THEN it will start to optimize for conversions later.

It’s an interesting option… and intuitively, it makes a lot of sense.

By optimizing for link clicks first, Facebook is able to gather more data, faster. Then it can use that extra data to speed up the optimization of your campaign later on… right?

Well, that’s the theory.

In reality, however, the results were a lot different when we actually tried it for ourselves.

We first tested this option to cold traffic (people who aren’t familiar with our brand). We created two identical campaigns, both of which were optimized for conversions.

The only difference was whether that “Optimize for link clicks…” button was clicked or not.

Our results were a little surprising:

  • “Optimize for link clicks…” WAS clicked: $10.30 per lead
  • “Optimize for link clicks…” was NOT clicked: $7.91 per lead

We ran a similar test to warm traffic (people who are familiar with our brand but haven’t bought yet), and the difference was even bigger:

  • “Optimize for link clicks…” WAS clicked: $12.22 per lead
  • “Optimize for link clicks…” was NOT clicked: $7.21 per lead

Clearly, this option did not work for us. Our CPL was 41% lower when we unchecked that box and simply optimized the campaign for conversions.
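The 41% figure comes from the warm-traffic test; here’s the same arithmetic applied to both audiences, using the CPL numbers above:

```python
def cpl_savings(cpl_with_option, cpl_without_option):
    """Fractional CPL reduction from NOT clicking the option."""
    return (cpl_with_option - cpl_without_option) / cpl_with_option

cold = cpl_savings(10.30, 7.91)  # cold traffic test
warm = cpl_savings(12.22, 7.21)  # warm traffic test

print(f"Cold traffic: {cold:.0%} lower CPL")  # 23%
print(f"Warm traffic: {warm:.0%} lower CPL")  # 41%
```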

So, what’s the deal? Does it simply flat-out not work?

Not necessarily. Keep in mind a couple of things:

First of all, this is a strategy that, by its very design, will take some time to really be effective. These campaigns may not have been running long enough to take full advantage of this feature (each ran for a week while we collected the data above).

Second, this option might be better suited for ecommerce companies or other businesses selling big-ticket items that don’t generate that many conversions.

But whatever you do, test it! That’s the only way to find out for sure what will work for YOU.

In our case, we believe that Facebook generated more click data than conversion data… so even though Facebook was generating more clicks at the beginning of the campaign, those clicks weren’t coming from the people who were most likely to convert.

Takeaways From This Test:

  1. Consider not clicking that button, especially if you expect to generate 100+ conversions within the first few weeks of your campaign. In our case, optimizing for conversions was significantly more effective.
  2. If you think this option might work for you, at least test it. Let the data tell you what works and what doesn’t.

Facebook Ad Test Case Study #3: How We Doubled Conversions by Optimizing for Customer Lifetime Value

In this last test, we wanted to see what the results would be if we used Facebook’s new “Optimize for Value” option:

So what is it?

Basically, this option tells Facebook to optimize for high-value customers to bring you the most total revenue possible.

So, rather than optimizing for conversions alone (i.e. maximizing front-end sales), we are telling Facebook to optimize for high-value conversions (i.e. maximizing total revenue).

It’s a really smart option for Facebook to offer.

So, we ran a split test to compare the results of optimizing for conversions vs. optimizing for lifetime value.

(And of course, just like before, everything else about the campaigns was the same.)

So, what were the results?

  • Optimizing for conversions: ROAS of $1.17
  • Optimizing for value: ROAS of $2.87

In other words, in the first campaign, we generated $1.17 for every $1.00 we spent. In the second, we generated $2.87 for every $1.00 we spent.
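To put the gap in perspective, you can express the value-optimized campaign’s ROAS as a multiple of the baseline and project revenue for any spend level (the $1,000 spend below is just an illustration, not a figure from the test):

```python
# ROAS figures from the split test above ($ returned per $1 spent)
conversions_roas = 1.17
value_roas = 2.87

# Value-optimized ROAS as a multiple of the baseline campaign
ratio = value_roas / conversions_roas
print(f"Value campaign ROAS is {ratio:.2f}x the baseline")  # 2.45x

# Illustration: revenue from a hypothetical $1,000 of spend under each objective
spend = 1000
print(f"Conversions: ${conversions_roas * spend:,.2f}")  # $1,170.00
print(f"Value:       ${value_roas * spend:,.2f}")        # $2,870.00
```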

The results here are pretty clear and clearly pretty awesome.

The campaign that was optimizing for value generated 2.45x the revenue per dollar of the standard “optimize for conversions” campaign — the 245% from this post’s headline!

It didn’t necessarily generate more conversions. It actually generated fewer total conversions. But the revenue per customer was higher, leading to that 2.45x jump in revenue per dollar spent.

This option won’t work for everybody. If you sell only one product at a single price point, then feel free to skip it.

But if you sell different products at different price-points, and you want to maximize the number of high-ticket items you’re selling…

Then definitely give this ad set objective a try! It’s well worth your time, as you can see from the results above.

Takeaways From This Test:

  1. Keep in mind the numbers that matter most for your business, and optimize for the most bottom-line metric you can. For most companies, ROAS is more important than total sales because it factors revenue into the equation.
  2. Anytime you’re able to optimize for revenue rather than sales, do it! One great way to do this is to test using Facebook’s new option, “Optimize for value.” The potential with this new targeting ability is huge.

One Little Button Can Make A Big Difference

When setting up ads in Facebook, you should never click on an option willy-nilly.

Always know what you’re clicking and why.

Because as you’ve just seen, for better or worse, one wrong click can have a big impact on the performance of your campaign.

I hope these case studies gave you some ideas and inspiration for your own campaigns. Now get out there and make a case study of your own! 🙂

DigitalMarketer

The lovely content team here at DigitalMarketer works hard to make sure you have the best blog posts to read. But some posts require a group effort, and we decided to stop the rock-paper-scissors tournaments that decided the byline so that we had more time to write. Besides, we all graduated from kindergarten: we can share.
