In a previous blog post, I explained what responsive search ads (RSAs) are, how they compare to expanded text ads (ETAs), and how you can implement them in your Google Ads account. I also covered Google’s claims about RSA performance: a higher clickthrough rate, a higher conversion rate, and a lower cost per click. But that got me thinking: Is that even true?
So, I decided to test those claims using data from two of Portent’s largest paid search clients. In this blog post, I’ll share that data with you and try to paint a clearer picture of RSA performance. If you want a more 101-level overview of responsive search ads and how to use them, I recommend checking out my previous post first.
Before we dive into the data, let’s quickly recap Google’s claims about responsive search ads. Google says RSAs may help you generate more conversions because your ads become eligible for more auctions. It also claims RSAs can improve your clickthrough rate, which in turn helps your Quality Score and lowers your cost per click. Google doesn’t guarantee performance by any means, but it’s a bold claim nonetheless. Any feature that can potentially boost every KPI is obviously worth trying, but you still need to test whether RSAs actually perform well in your specific account.
Now that we’ve quickly recapped the potential performance of adding responsive search ads to your account, let’s take a look at the data.
To get the most reliable data possible, we looked at two of the highest-spending accounts that Portent manages: one in the outdoor apparel industry (I’ll call them Client One) and another in the HVAC industry (I’ll call them Client Two). We pulled all search data from a six-month period.
Once we had the relevant data, we had to decide exactly which metrics to focus on. To compare RSA and ETA performance as accurately as possible, we chose clickthrough rate (CTR), cost per click (CPC), and conversion rate (CVR). We chose conversion rate instead of raw conversion counts so the data wouldn’t be skewed by one ad type receiving more traffic than the other.
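To make the traffic-skew point concrete, here’s a minimal sketch of the three rate metrics. The numbers are invented for illustration, not real client data: the RSA-style row gets five times the clicks, so its raw conversion count looks better even though its conversion rate is actually lower.

```python
def ctr(clicks: int, impressions: int) -> float:
    """Clickthrough rate: clicks / impressions."""
    return clicks / impressions

def cpc(cost: float, clicks: int) -> float:
    """Cost per click: total spend / clicks."""
    return cost / clicks

def cvr(conversions: int, clicks: int) -> float:
    """Conversion rate: conversions / clicks."""
    return conversions / clicks

# Hypothetical numbers: the first ad type serves far more often.
rsa = {"clicks": 5_000, "conversions": 250}   # 250 raw conversions
eta = {"clicks": 1_000, "conversions": 55}    # only 55 raw conversions

# Raw counts favor the RSA, but the rate tells the real story:
print(cvr(rsa["conversions"], rsa["clicks"]))  # 0.05  (5.0% CVR)
print(cvr(eta["conversions"], eta["clicks"]))  # 0.055 (5.5% CVR)
```

Comparing rates normalizes away the traffic difference, which is exactly why we used CVR rather than flat conversions.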
Now without further ado, it’s time to get into the results.
Let’s start with CTR:
Of the three metrics we looked at, CTR was easily the hardest to interpret. For Client One, responsive search ads outperformed expanded text ads by 24.5%. If you looked at this account alone, RSAs would seem like a slam dunk, but unfortunately, we’re not just looking at one account.
The takeaways get a lot murkier with Client Two. There, we actually saw a slight decrease in CTR compared to traditional expanded text ads. It’s only a 3% drop, but it still calls the overall effectiveness of responsive search ads into question.
Let’s look at CPC next:
I’d call that pretty inconclusive. Despite extremely large data sets over a six-month period, cost per click came out exactly the same for both Client One and Client Two.
This isn’t to say that RSAs can’t help cost per click; they might. In these two accounts, however, we saw no evidence that CPC improved because of responsive search ads. If I were to run this test again, I’d try it in a brand-new campaign that had never run either RSAs or ETAs. Because the ETAs were running in these accounts long before the RSAs, they had a head start to improve their CPC over a long period of time.
Finally, let’s look at CVR:
Now that’s what I’m talking about! For both Client One and Client Two, we saw substantial increases in overall conversion rate with responsive search ads compared to traditional expanded text ads: 40.7% and 13%, respectively. RSAs clearly generated more conversions than traditional ads, which, at the end of the day, is what you’re after in a paid search account. This is likely due to the larger pool of headlines and descriptions, combined with our conversion-focused bidding strategies giving Google more room to optimize.
Despite showing no evidence of improving CTR or CPC, responsive search ads helped these two clients see a significant lift in conversion rate over a six-month period. And while CTR and CPC matter, they don’t mean much if you’re not turning those clicks into conversions. So, on the metric that matters most, I’d call this experiment a win for responsive search ads.
Don’t just take my word for it, though! Every account is unique and will likely see different results from the same test. I know of multiple clients who have tested RSAs and not seen the performance they expected, for a variety of reasons. So run your own tests, draw your own conclusions, and do what’s best for your company or your client. Happy testing!
The post Expanded Text Ads vs. Responsive Search Ads: A Performance Comparison appeared first on Portent.