Before you start: what do you know about SEO split-testing? If you’re unfamiliar with the principles of statistical SEO split-testing and how SplitSignal works, we suggest you start here or request a demo of SplitSignal.
First, we asked our Twitter followers to vote:
The test was POSITIVE. Our followers guessed it right (follow us on Twitter to vote next time).
Content freshness has been a hot topic in SEO since Google rolled out its “fresh results” algorithm update in 2011. OrangeValley and SplitSignal regularly test the impact of content freshness for both Google and users. The benefit of fresh content varies by search query, as different searches have different freshness needs. Looking at the Google Search Quality Rater Guidelines (October 19, 2021), we know that for product queries, users mostly want current content, since they are usually seeking information about the most recent model or version.
We wanted to validate whether signals of content freshness were a factor (for Google and/or users) for the category pages of a major online furniture retailer in the US.
Before moving on to the case study, let’s take a look at what content freshness means. Fresh content is content that has been recently updated, regularly updated, or recently published. Google distinguishes three fresh content types for searches:
- Recent events or hot topics. When users are looking for trending or news content.
- Regularly recurring events. When users are looking for recurring events, such as sporting events or weather information.
- Frequent updates. When users search for information that changes often but is not really a hot topic or a recurring event, such as product/service research.
The website in question has a lot of regularly updated content (product listings) on its category pages. We wanted to validate whether adding “Updated” and the current year (“Updated 2022”) to the page title would have a positive effect on organic traffic to the category pages, and whether Google would treat it as a “fresh content” signal.
So, we used SplitSignal to set up and analyze the test. We selected 985 category pages and assigned each to either the variant or the control group. We kicked off the test and ran it for 21 days, during which Googlebot visited 97% of the tested pages.
The image above shows the performance of the variant pages (orange line) compared to the modeled control group (blue line). Traffic to the variant pages exceeded the forecast, which means the test is positive.
Note that we are not comparing the actual control group pages to our variant pages, but rather comparing the variant pages’ actual traffic to a forecast based on historical data. The set of control pages gives the model context for trends and external influences: if something else changes during the test (e.g., seasonality), the model detects it and takes it into account. By filtering out these external factors, we gain insight into what the impact of an SEO change really is.
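To make the counterfactual idea concrete, here is a minimal sketch in Python. SplitSignal’s actual model is proprietary and more sophisticated; this toy version simply fits a linear relationship between control-page and variant-page clicks on the pre-test period, then uses that fit to forecast what the variant pages would have received without the change. All numbers are made up for illustration.

```python
# Toy counterfactual forecast: fit variant clicks as a linear function of
# control clicks on the pre-test period, then forecast the test period.
# (Illustrative only -- not SplitSignal's actual model or data.)

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Pre-test period: variant and control traffic move together.
control_pre = [1000, 1040, 980, 1100, 1020, 990, 1060]
variant_pre = [520, 535, 505, 560, 525, 510, 545]
a, b = fit_line(control_pre, variant_pre)

# Test period: forecast the variant from the control, compare to actuals.
control_test = [1010, 1050, 1000]
variant_actual = [545, 570, 540]
variant_forecast = [a + b * x for x in control_test]

# Daily uplift: actual clicks minus the counterfactual forecast.
uplift = [act - pred for act, pred in zip(variant_actual, variant_forecast)]
```

Because the forecast moves with the control pages, a site-wide traffic swing (say, seasonality) shifts both the forecast and the actuals, and only the gap attributable to the change remains.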
The cumulative view not only shows the additional (or lost) organic traffic to the tested pages but also whether the test result is significant. When all three curves run below (negative) or above (positive) the y=0 axis, the test is statistically significant. That means we can be confident that the difference we are seeing is due to the change we made and not to other (external) factors.
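SplitSignal’s significance machinery isn’t public, but the underlying idea can be sketched: put a confidence interval around the cumulative uplift, and call the result significant when the interval excludes zero. One common way to build such an interval is a percentile bootstrap over the daily gaps between actual and forecast clicks. The daily figures below are hypothetical.

```python
import random

def bootstrap_cumulative_ci(daily_uplift, n_boot=10000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for the cumulative uplift."""
    rng = random.Random(seed)
    n = len(daily_uplift)
    totals = sorted(
        sum(rng.choice(daily_uplift) for _ in range(n))  # resample one window
        for _ in range(n_boot)
    )
    lo = totals[int((alpha / 2) * n_boot)]
    hi = totals[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical daily differences (actual minus forecast clicks) over 21 days.
daily_uplift = [12, -3, 8, 15, 6, -1, 10, 9, 14, 4, 7,
                11, -2, 13, 5, 9, 16, 3, 8, 12, 6]

lo, hi = bootstrap_cumulative_ci(daily_uplift)
significant = lo > 0 or hi < 0  # interval excludes zero -> significant
```

If the interval straddles zero, the observed uplift could plausibly be noise, which is exactly the situation the “all three curves on one side of y=0” rule guards against.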
After running the test for 21 days, we saw a 4.9% increase in organic clicks to the tested pages, with a confidence level of 99%.
As mentioned at the beginning, when Google knows that users are looking for fresh content, freshness can be a factor in ranking the search results. With this test, we wanted to validate whether updating the page title with signals of fresh content would matter to Google and/or users; analyzing the data in detail gives us the answer.
Analysis of the data shows that this test impacted the click-through rate (CTR) of the variant pages. Compared to our modeled control group, rankings and impressions actually decreased slightly, so the increase in clicks appears to be driven purely by user behavior. Including the year the product listings were updated makes for a good CTA in the search results, but we don’t think it’s a signal of “fresh” content for Google. Google actually seemed to devalue the variant pages, meaning that in Google’s eyes the results were probably less relevant to the query. One reason could be that each page title now starts with the same CTA (“Updated 2022”) rather than the primary, unique keyword(s) the page targets.
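The decomposition behind that conclusion is simple arithmetic: clicks = impressions × CTR, so clicks can rise even while impressions fall, as long as CTR rises faster. A small worked example with illustrative numbers (not the actual test data):

```python
# Illustrative numbers only: clicks = impressions * CTR, so a CTR gain can
# outweigh a small impression loss.
before = {"impressions": 100_000, "clicks": 3_000}
after = {"impressions": 97_000, "clicks": 3_147}  # ~4.9% more clicks

ctr_before = before["clicks"] / before["impressions"]  # 3.0%
ctr_after = after["clicks"] / after["impressions"]     # ~3.24%

click_growth = after["clicks"] / before["clicks"] - 1  # ~4.9%
```

Here impressions drop 3%, yet clicks grow about 4.9% because the CTR improves from 3.0% to roughly 3.24% — the same pattern the test data showed.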
This test shows that what users consider a freshness signal (or a good CTA) isn’t necessarily a freshness signal for Google. You won’t “fool” Google by simply updating a title. But perhaps more importantly, this test also shows that knowing what users want and expect (search intent), and responding accordingly, is just as important.
We know from experience that the results of a comparable test can differ between websites. Given the impact a change like this can have, it’s a good idea to test it before rolling it out across your website. In this case, we are confident that enriching search results with freshness signals (for users) is an excellent idea.
Have your next SEO split-test analyzed by OrangeValley Agency.