Episode 108: How Facebook’s Ad Algorithm Works

Join the experts and guest Drew Tweito of FunnelBoom as they detail how Facebook’s ad algorithm works and how you can use it to scale your Facebook campaigns. They’ll share how to work with the Facebook algorithm so it does the work for you, so you can increase your ad’s ROI while spending less time in Facebook’s Ads Manager.

IN THIS EPISODE YOU’LL LEARN:

  • What golf and Facebook’s ad algorithm have in common (« hear Drew’s metaphor that will give you insight into the algorithm).
  • A scaling strategy you can use to specifically target your market and let the Facebook algorithm do the heavy lifting.
  • How many days to let a Facebook ad run (« and why tinkering with your ad too soon can actually hurt your campaigns).

LINKS AND RESOURCES MENTIONED IN THIS EPISODE:
Episode 71: The Michigan Method: A Strategy for Scaling Ad Campaigns
Visual Walkthrough and Checklist from FunnelBoom
Episode 108 Transcript (swipe the PDF version here):

Keith Krance: Hello, and welcome to Episode Number 108 of Perpetual Traffic. How are you guys doing?
Molly Pittman: Doing great!
Ralph Burns: We’re good.
Keith Krance: Today, we've got a special guest. We've got Andrew Tweito from FunnelBoom on today to talk about the Facebook Ads Algorithm. I think a lot of you are going to be excited about what we get into in the middle and the end of this episode, because for some of you, actually simplifying might be the best answer for more ROI. I think Andrew is going to talk a bit about machine learning.
  We hear all the time about how smart the Facebook algorithm is. Just to kick things off, can you try to explain to us exactly what that means, how it applies to the algorithm, and how that affects us as business owners and marketers?
Drew Tweito: When we talk about machine learning or "the algorithm," we're talking about how Facebook looks at the performance of our ad-set, how it considers our audiences and the pixel data we're sending, and how it combines all of those factors to help you reach the most valuable people at the ad-set level.
Ralph Burns: Yeah, and just to add onto that, I think one of the outcomes of this episode is going to be good news for a lot of people who are probably doing too much work on their current Facebook ad campaigns. I think we're all guilty of it. The element of advertising we can control is our own actions; we can't really control what Facebook does in the auction, in the algorithm itself. So there's a tendency to overcomplicate things by doing too much.
  On Episode 71, we go into the Michigan Method, which is a strategy for scaling ad campaigns. It's a good starting point for a lot of mid-level advertisers. It might not be the best way to start when you're brand new, but once you actually do your testing, the Michigan Method, done correctly, can produce the outputs you can then use in larger-scale campaigns and let the algorithm really do the work.
  Now, there are updates going on constantly in the algorithm. Andrew can get into this even more, but think about the algorithm like a software update. You know when you boot up your laptop and it says, "An update is available, would you like to update now?" Or app updates. The algorithm is constantly updating like that, not necessarily changing per se. But because the newsfeed is so crowded right now, and many, many people are bidding for a place in the auction, a seat at the table, Facebook is trying to maintain the user experience so it doesn't alienate its user base, while balancing that against giving the best advertisers room to enhance that experience.
  I think that the algorithm, the machine learning, is constantly updating, not necessarily changing per se, but you've got to update your strategy alongside it. Hopefully, we'll get into some real, actionable strategies here on the show.
Molly Pittman: It's interesting, this episode has a real balance to it, because a lot of what we're talking about is really technical, and that's what I'm excited to hear about. I'm not very technical; honestly, I look at things much more from a strategy standpoint. Understanding more about how it works is great, but on the other side of that is your mindset, so understanding the machine, but then also understanding that a lot of this has to do with the way you're thinking about it.
Keith Krance: We were chatting before we hit record. Andrew had a great metaphor, what were you saying again?
Drew Tweito: What I was talking about earlier was a golf analogy, if you’re a golfer, or you’ve been around a driving range before, you’ve probably seen this guy at the range, or at the golf course who is swinging the club so hard, literally sweating, and putting every ounce of energy he has into trying to hit the golf ball. Well, if you watch the guys on TV, like, if you’ve ever seen Ernie Els play golf, you’ll see that it looks like he’s barely swinging but he can hit the ball 300, 350 yards, whereas the guy at the range is usually the one who’s also like topping the ball 20 yards, or chunking it. He’s barely making contact.
  In golf, a big saying is "Let the club do the work," right? That's what the PGA Tour players do, that's what the best golfers in the world do. Titleist, or whoever made that club, built it to hit the ball far. You don't have to do that much at the end of the day to make it go far and straight.
  The same thing is kind of true for Facebook, in that the ad-auction technology is built to find the highest-value people for you. Like Ralph was alluding to earlier, we can start off by testing and trying to manually find the right audiences, the right placements, the right creative, and that's all good.
  But when we're trying to scale, sometimes it's better to just let Facebook loose. Let them do the work. Let the technology do the work for you and be a little more hands-off, because through their data science and their machine learning, looking at hundreds of thousands of variables, they're going to be able to identify who out of that huge audience they should be going after much better than we ever could manually.
Molly Pittman: What do you think comes into play there, Andrew? Obviously, who you're targeting, so who you've said you want to show this ad to, and what you're optimizing for, so what you're telling Facebook you want, whether it's conversions, or clicks, or whatever you're asking for. What would be the next level of that? From your knowledge, what else is controlling it?
Drew Tweito: What’s controlling their ability to identify those people?
Molly Pittman: Exactly, yeah.
Drew Tweito: I believe there’s about 272,000 attributes they have for each person.
Molly Pittman: Whoa.
Drew Tweito: For each person that is on Facebook, and this manifests typically through things like interests, what you can select in Audience Insights or in your targeting. This would be like: what is Molly interested in, what's her interest set? What are her demographics? How old is she? I don't know the exact dimensions they're using there.
Molly Pittman: No one really does, but just broadly, because that’s interesting.
Drew Tweito: Yeah, I think Zuckerberg has them written down on a piece of paper in a safe somewhere or something.
Ralph Burns: Right next to the formula for Coke.
Molly Pittman: Do you think a lot of that has to do with what you've shown interest in? Obviously, liking pages or commenting on certain posts, your behavior inside of Facebook. And it obviously has something to do with your behavior on websites too, since most websites have the Facebook pixel installed. I mean, they have to know who's most likely to go a certain number of pages into a website, or to take a specific action. Do you think that comes into play?
Drew Tweito: Yeah, I think there are a ton of behavioral attributes in there. We're not just talking about more static things like age, or what you do for a living. They're also looking at data … Facebook's not the only data platform out there that has this, but they know what types of sites you've been on lately, and that's recency. That's really important. That's part of how the machine learning works: it sorts through the variables and figures out which ones are the most important for your particular ad-set.
  I would never underestimate the power of the engineering teams at these companies, with things like text analysis. They take your posts and analyze them. They pull out keywords. They look at the pictures you post; you can use machine learning to figure out where a person is and what they're doing. At a more basic level, yeah, what sites you've been to, what things you're shopping for.
Molly Pittman: And what about from a conversion standpoint? Actually, the photo piece is interesting first, figuring out where you are. Because I have noticed that when I post a picture on Facebook, or when I land in a new city, it's like, "Hey, here are other friends who have visited Dallas."
Drew Tweito: Right, right.
Molly Pittman: Like, “See what they’ve done.” In terms of the conversion aspect, from a purchase or an opt-in standpoint, what do you think they’re looking at? When you tell Facebook, “I want conversions. I want people to buy, or opt in,” what do you think they’re looking at there? Again, everybody, we’re totally speculating. This is not from Facebook, but it’s a good discussion and stuff I think about a lot.
Drew Tweito: At the end of the day, the machine learning, or the data science, no matter where it is, Facebook, Tesla, whatever company, whatever data science team is there, there's typically a set of pretty commonly used methods to do this type of stuff. It's all about prediction, right? There are a number of ways to do it. I don't know which specific one they're using.
  I will say that when they're trying to predict conversions, what they really want to do at the end of the day is say, okay, we have multiple objectives here. We have multiple competing goals. The advertiser wants return on ad spend; we need to get them that for them to continue to advertise with us. In this case, they want to lower their CPA.
  The platform, Facebook, "us," doesn't want to burn through its user base. They want engagement, and stickiness, time on platform. Then the auction has to sort through everyone's objectives and figure out, okay, a lot of advertisers want this placement, who gets it? There are a number of factors that go into that. But in terms of conversions, how they decide based on a conversion objective, they're going to look at the historical converters first.
  That hard-and-fast rule we used to talk about, that you need X number of conversions per day, isn't 100% true. It's not like if you don't hit it your ad is just going to turn off and stop delivering, in most cases. What they're doing is taking the people who have converted. Let's say that over the course of three days you get 30 conversions, right?
  Now they've got 30 people, and each of those 30 people has a huge set of attributes Facebook knows about them. They look through it and say, "Well, which of these attributes are common?" They can weight those, and then they compare the people who converted against the people who didn't convert. They go, "All right, where's the pattern here, where are the big differences?"
Molly Pittman: Right, yeah, how are they similar?
Drew Tweito: Exactly, and that's what the machine learning, the data science, the predictive modeling really does: it looks at these attributes and tries to figure out what's predictive about this data set. How can I predict an action from underlying features or attributes?
  It'll look first at the historical converters and go, okay, from this set I can start to predict a conversion rate for the rest of the user base, or the rest of the target in that ad-set. So when the auction comes up, they already know Molly has a 4% chance of converting, Keith has an 8% chance.
Molly Pittman: Wow.
Drew Tweito: Yeah, Ralph has 20%, and so forth. And your percentages will be different in every other ad-set in the auction. You can imagine the size of this data, it would be insane. But at the end of the day, they want to deliver on what's called a true value, or expected value. That's where the bidding comes into play, and that's where the return part comes into play.
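
(Editor's note: to make Drew's predict-then-rank description concrete, here is a minimal Python sketch: fit a simple classifier on who converted versus who didn't, score everyone else with a conversion probability, and rank by expected value, probability times what a conversion is worth. The attributes, the logistic-regression model, and every number here are illustrative assumptions; Facebook's actual models and auction are proprietary and far more sophisticated.)

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Each user is a row of attributes. Real systems would have thousands of
# columns; these three are invented for the example.
n_users = 1000
X = np.column_stack([
    rng.integers(0, 2, n_users),   # visited the pricing page?
    rng.uniform(0, 30, n_users),   # days since last site visit (recency)
    rng.integers(0, 2, n_users),   # liked a related page?
])

# Simulated "ground truth": recent pricing-page visitors convert more often.
logit = 1.5 * X[:, 0] - 0.1 * X[:, 1] + 0.8 * X[:, 2] - 2.0
converted = rng.random(n_users) < 1 / (1 + np.exp(-logit))

# "Look at the converters and the non-converters, find the pattern."
model = LogisticRegression().fit(X, converted)

# Score new people: everyone gets a conversion probability, like
# "Molly has a 4% chance of converting, Keith has an 8% chance."
audience = np.array([
    [1, 2.0, 1],    # warm: recent pricing-page visitor
    [0, 25.0, 0],   # cold: hasn't visited in weeks
])
p_convert = model.predict_proba(audience)[:, 1]

# Expected value = conversion probability x what a conversion is worth to
# the advertiser (the bid). Higher expected value wins more auctions.
bid = 40.0  # assumed value per conversion
for label, p in zip(["warm user", "cold user"], p_convert):
    print(f"{label}: p(convert) = {p:.1%}, expected value = ${bid * p:.2f}")
```
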
Molly Pittman: That's great, I've never heard such a good description of that, Andrew, because when we teach, we just say, yeah, optimize for conversions, Facebook is going to show it to the people who are most likely to convert. That's all I can really explain. Thinking about how that correlates with the auction also goes to show why ad costs are so different between markets, right?
Drew Tweito: Oh yeah, yep.
Molly Pittman: It also goes to show why certain offers, in my opinion, have a really viral effect, while with others, especially when we're running conversion campaigns optimized for leads with some of our Lead Magnets, the campaigns just never get off the ground.
Ralph Burns: So, once you have some data, let's say you've done some testing, and this is maybe beyond the Michigan Method, whatever testing you did to start off, you've figured out what audiences work. Maybe you've got a couple of lookalike audiences, maybe you've got four or five different interest groupings.
  Once you've figured that out from your initial testing data-set, and I definitely recommend people go back to Episode 71, because that's a really methodical way to test, though it's not right for everyone; it depends on your budget. So, what would be the next level of scale and leverage to get the result, the expected outcome you really want with regard to website conversions? How would you take it to the next step?
Drew Tweito: Once I've gotten an idea from testing different placements, audiences, and creative, and mainly I'd be focused on learning which creative is working, whether you're doing that through the Michigan Method or AdEspresso or some third-party software, once you have those learnings and you're ready for phase two, phase two is more about giving Facebook room to work. Let the club do the work.
  All right, so let's say you're doing the Michigan Method or some similar style: you have a ton of ad-sets targeting the same people, all on low budgets. We know that the learning accrues at the ad-set level; that's where all of this is happening. If you have 50 ad-sets and you're getting 50 conversions a day across them, that means every single one of those ad-sets has only seen one conversion, one positive and a bunch of zeros. You can imagine: how is it supposed to derive a pattern, or a solid prediction about who is going to convert next, from just one positive?
Ralph Burns: Yeah, there’s not enough data. It will work for a while but there’s just not enough data for longevity.
Drew Tweito: Facebook is going, all right, well, I've got to spend this daily budget tomorrow, and I still don't really have a better idea than I did when I started about who to send this to, so we're just going to kind of randomly deliver it. When you get to a certain point there, you need to take your learnings and let the club do the work.
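
(Editor's note: a back-of-the-envelope sketch of why one positive is so little signal. It compares the plausible range of conversion rates implied by one conversion in a single ad-set versus fifty conversions consolidated into one. The click counts and the Beta-prior method are our own illustrative assumptions, not anything Facebook has published.)

```python
from scipy.stats import beta

def rate_interval(conversions, clicks):
    """95% credible interval for a conversion rate, uniform Beta(1,1) prior."""
    posterior = beta(1 + conversions, 1 + clicks - conversions)
    return posterior.ppf(0.025), posterior.ppf(0.975)

# One ad-set out of fifty: a single conversion from, say, 200 link clicks.
lo, hi = rate_interval(1, 200)
print(f"1 conversion / 200 clicks:   rate could be anywhere from {lo:.2%} to {hi:.2%}")

# The same traffic consolidated: 50 conversions from 10,000 clicks.
lo, hi = rate_interval(50, 10_000)
print(f"50 conversions / 10k clicks: rate is pinned between {lo:.2%} and {hi:.2%}")
```

With these made-up numbers, the single-conversion ad-set's rate estimate is uncertain by more than an order of magnitude, while the consolidated estimate is already fairly tight, which is the "juice" Drew describes next.
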
  Okay, so in phase two, we're going to move to a setup with fewer ad-sets. Instead of having a number of different ad-sets targeting the same people, we're actually going to go almost the opposite way and combine audiences into one ad-set. Here's the reason: let's say you have 10 lookalike audiences, they're all US lookalikes, it's all for the same offer, and all that good stuff.
  From Facebook's engineering point of view, the only reason to ever split those out into separate ad-sets is if you're going to bid differently for them. So the next time you're building a campaign and you're about to split out ad-sets, if you're at this scaling stage, my challenge to you is to ask yourself: why am I splitting these out? Will I bid differently for them? If the answer is no, then put them together.
  This is going to do two things for you that help you swim with the current of the auction system and the machine learning. First, going from 20 ad-sets in stage one, where we're testing, down to one, we consolidate all those conversions into one ad-set. I sometimes call that the juice: you're getting all the juice into one ad-set instead of 20. That's so important, especially for website conversion campaigns where the conversion rates might not be as hot. The difference between one conversion a day and 20 a day is going to make a huge difference.
  The other thing, and this is the part that a lot of people miss that's really key, is what happens when you have ad-sets with tiny audiences, say you break interests up into different ad-sets, one interest per ad-set.
  Chances are those audiences are going to be related somehow, right? Some people are going to be in more than one of those interest groups. But because all the learning happens at the ad-set level, and all the reach is controlled at the ad-set level, all these people who would have been great candidates, judging by the conversions you've gotten in one of your ad-sets, Facebook can't even see them from that ad-set, unless maybe you're using expanded interests.
  So what we get there is better reach and availability in the auction. If I go from ten 100,000-person ad-sets up to one ad-set with the same group, a million people, your performance should improve drastically, because it's a compounding benefit. The consolidated juice gives us a tighter confidence interval and a better idea of the expected conversion rate, so Facebook can predict better who's going to convert. It's like Facebook can now see a whole stadium full of people instead of a high-school gymnasium, you know what I mean? Now they can apply a probability to everybody.
  Remember, with a daily budget they've got to try to spend that money regardless of who's online, especially if you're auto-bidding. This gives your highest-expected-value people the best shot of even being in the auction, of even showing up that day, and of us being able to reach them.
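
(Editor's note: a tiny Python sketch of the consolidation argument. Ten overlapping 100,000-person lookalike audiences split across separate ad-sets fragment both the reach and the learning; combined, they become one deduplicated pool the model can score as a whole. The audience sizes and the amount of overlap are invented for the example.)

```python
import random
from collections import Counter

random.seed(7)
pool = range(1_000_000)  # pretend national audience pool, invented size

# Ten overlapping "lookalike" audiences of 100,000 people each.
audiences = [set(random.sample(pool, 100_000)) for _ in range(10)]

# Split into ten ad-sets, each ad-set only ever sees its own slice...
entries_if_split = sum(len(a) for a in audiences)

# ...combined into one ad-set, Facebook sees one deduplicated group.
combined = set().union(*audiences)

# How many people sit in two or more of the "separate" audiences?
membership = Counter()
for a in audiences:
    membership.update(a)
in_several = sum(1 for n in membership.values() if n > 1)

print(f"Ten separate ad-sets: {entries_if_split:,} audience entries (duplicates included)")
print(f"One combined ad-set:  {len(combined):,} unique people")
print(f"People appearing in 2+ audiences: {in_several:,}")
```

With this toy overlap, a large share of the unique people appear in more than one audience, so splitting them across ad-sets means the conversion learning about any one of them stays siloed in whichever ad-set happened to reach them.
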
Molly Pittman: Andrew, what do you think about audience size? If the audience is too big, should you split it into multiple ad-sets, or do you think audience size doesn't matter?
Drew Tweito: I don't know the 100% answer to that. But recently, and it might just be because the competition in the auction, and on Facebook ads in general, has gotten a lot tougher, we have seen bigger audiences working.
Molly Pittman: I feel the same way. It’s easier, like I feel like bigger audiences are working better, especially if you really have a conversion that’s been happening frequently, which is nice. I think that goes to show that the algorithm has gotten a lot smarter.
Keith Krance: 100% agree there.
Molly Pittman: Hundy.
Keith Krance: We've had some tests where we had to scale as fast as possible, up to $50,000 a day. Actually, Terry, who's one of our coaches in our Navigator Coaching Group, did a webinar about that, because they had to scale to $50,000 a day in like three days. They did it with no targeting, because they already had like 200 to 300 conversions, so they used really big audiences. We've seen it in several cases, right, Ralph? Some of the best campaigns are targeting 30 to 40 million people and running for a long time.
Ralph Burns: Oh, yeah, absolutely. That just shows the power of the algorithm, I think. We've done it plenty of times by mistake, even in testing mode, where the interest for whatever reason didn't catch and-
Keith Krance: Delete.
Ralph Burns: The Power Editor just deleted it for whatever reason, so it's wide open to 130 million people in the US, and it's the best-performing ad-set. Notice we're saying ad-set here, because there's a lot of data that shows this machine learning happens via the ad-set. The baseline is that 25 conversions in a seven-day period, but you want more than that, as many as you can get, which is why level two of scaling is obviously larger audiences and larger budgets, so you can feed the algorithm positive data and let that machine learning work for you instead of against you. A lot of times this means not doing a whole lot, right, Andrew? It means letting the algorithm do the work while you sit back and be patient.
Keith Krance: Or get your messaging right first, put most of your energy there.
Ralph Burns: That’s what we test in the Michigan Method. You can’t do this right out of the gate.
Molly Pittman: You have to have a good offer. Everything has to be pretty perfect.
Keith Krance: You've got to test this stuff upfront first. You can't go right into this; you'd be missing a whole step, because you can't go to $500 a day when you don't know what's going to work. You have to test smaller and then get larger. You don't know what a good message is unless you actually test it with traffic.
Ralph Burns: Yeah, exactly. It's kind of like surfing. Let's say you're trying to paddle into a wave. If you're not in the right spot, if you don't time it right, or if you're just a little bit too far off shore, you're going to be paddling all day long. You're never going to catch any waves, because there's so much power in that wave.
  What happens is there are a lot of people out there focusing all their time and energy on optimizing, and testing, and all this stuff. Instead, what they should be focusing on is the message. I'll spend two days writing an ad, and it usually works right out of the gate with zero testing. Now, of course, the more testing we add to that, the better it does, but if you don't put the work into creating offers that convert and good messaging, then sometimes it's like, are you stepping over a dollar to pick up a dime?
  If you're an entrepreneur, or it's somebody on your team, you have to think a little bit about where your time is best spent. Now, if you're an agency, like DWM Agency, where in order to be a client you've got to have a high-converting funnel and a lot of that stuff already in place, and you're just focusing on running and optimizing the Facebook ads as well as giving guidance on strategy and messaging, then you're going to be able to spend more time testing and things like that, right, Ralph? But if you're, let's say, a solopreneur, or a marketing director with other departments, sometimes you have to weigh those things.
  I think the key is to understand how it works. If Facebook works better with large audiences, it's going to work way, way better if you also have a great message and great ad copy. And the fastest way to know which of your messages works is to test them.
Drew Tweito: That's the other big key point here: if some of this information helps you simplify your approach and spend a little less time in Ads Manager, which might sound counterintuitive, you can focus that time on things that I think are bigger levers. I think everyone would agree, like working on your follow-up, working on-
Molly Pittman: Your copy, your messaging, your creative… everything, your offer, your backend, creating new products to increase customer value. That's exactly how I look at it. I'm not in the Ads Manager a lot. We'll do testing, and then once our ads are set up and in a good place, I'll get in there every few days to check on how things are looking and make tweaks.
  Facebook is not a day-trading platform. You said that earlier, Andrew, and I thought that was really good. It's built to be easy to use. They want people to spend more money, they want us to spend more money. If they can make the platform better, easier, and lower-maintenance, then they're going to make more money, and advertisers are going to be happy too.
  This shouldn't be complete rocket science. It's so good to know how it works, because the better you understand it, the better you're going to be as an advertiser. But on the other side of that, realize that this is built to help you. You're not working against the platform. Once you find something that works, you shouldn't be making tweaks every 15 minutes, because that's actually going to hurt you. You're not giving Facebook enough time to do what it's good at.
Ralph Burns: Absolutely. I think a takeaway from this is that once you've dialed in that message, which is certainly really important to do… The greatest copywriters can get it right, right out of the gate. If you really know your audience, you might be able to get it right out of the gate too.
  In most cases, we don't get it right, right out of the gate. We fail a lot. We fail 70% of the time, or 80% if we're not having a very good day. The point is that once you have that message, start thinking about how you can do less to get more from the algorithm: leveraging larger audiences, combining those audiences, larger daily budgets, and then even some manual bid optimization, which maybe we can get into a little here.
  The point is that inactivity in an ad-set, or in a campaign, does not mean inaction. With one particular client over the last two weeks, we deliberately didn't pause an ad-set that had spent money over a two-day period. I explained to them that leaving the ad-set and the campaign alone doesn't mean nobody is watching. It means we are looking at it and making a strategic decision to let the algorithm smooth out those inequalities, those ups and downs, because ultimately, we'll win.
Molly Pittman: Yeah, I see this in DigitalMarketer's Engage a lot. Our customers will post things like, "Launched an ad an hour ago. Only 15 people have seen it. What's wrong?" We don't even look at our ads for three days at least, sometimes five, because otherwise you're making changes based on something that hasn't had time to do what it's supposed to do.
Ralph Burns: People want to meddle with their ads because it’s money.
Molly Pittman: They want to tinker.
Ralph Burns: It's emotional, it's money that's being spent. If you're spending thousands of dollars a day and you see inaction, it might look like your agency, or your ads manager, whoever it is, or you yourself are asleep at the switch, but that's not necessarily the case. What we're doing is letting the algorithm do the work here, because the algorithm is smarter than all of us.
Keith Krance: We had one of our clients come on in our Navigator group and share how they've basically built their own algorithm. I'm not going to get into it, but they check things after three days, and, except in extreme cases, typically nothing gets turned off until after 14 days.
Drew Tweito: Oh, yeah.
Keith Krance: One thing I want to add here, just pay close attention: notice that Andrew, with a data science background, is talking about simplifying, right, and letting the algorithm do the work. Just remember that one thing, I think that's so-
Ralph Burns: A data scientist simplifying things.
Keith Krance: All right, awesome stuff here, Andrew. This is a lot for one episode; we're going to have to have you on again as soon as possible. But is there anything else you'd like to add? Any resources you can give people listening right now?
Drew Tweito: A couple of key takeaways. Number one, remember, as Molly was saying, it's not a day-trading platform. Every time you go in there and mess around with the settings in the ad-set, or change something, it actually has to recalibrate. It is really hurting you. It's not just that you're wasting time; it's hurting your ad-set performance if you're doing that. Do less, that would be number one.
  Number two, let the club do the work, like we talked about; test it out. I will say that a lot of people who are maybe from the older school of Facebook advertising, when they hear this, will say, "Well, I've tried that. I've done that before. It doesn't work. You can't do big ad-sets, big budgets."
  Well, there's another phase to this, and we'll probably talk about it on another episode, but you can use everything we talked about today going with bigger ad-sets. That's what we're doing right now for some clients with automatic bidding. When you start getting really high on the daily budgets, there's another step to avoid getting your budget blown out, and we'll give you more information then, because that's kind of a whole other story.
  That's the other thing: if you're listening to this and you're super skeptical, you think we don't know what we're talking about, it's not entirely impossible that I don't know what I'm talking about, but this is a newer take on an old strategy, I will say.
  Okay, and if you want a visual walkthrough of the stuff we talked about here, plus a checklist of best practices for working with the auction instead of against it, you can go to funnelboom.com/bid. B-I-D.
Molly Pittman: B-I-D.
Drew Tweito: Boom, baby, boom.
Molly Pittman: Thank you so much for coming on. You are brilliant and humble, and those are a good match. We appreciate you.
Ralph Burns: Well said, Molly.
Drew Tweito: Thank you, guys, appreciate it.
Molly Pittman: Love you.
Ralph Burns: See you.
Keith Krance: Love you all.

Thanks so much for joining us this week. Want to subscribe to Perpetual Traffic? Have some feedback you’d like to share? Connect with us on iTunes!

iTunes not your thing? Find us on Stitcher or at DigitalMarketer.com/podcast.