Did you see the Google announcement that they don’t use rel=prev and rel=next?
The SEO community is in an uproar because Google has not been using this signal for YEARS without telling us.
If you’re just catching up, here’s the Google announcement:
And here’s a screenshot from their 2011 blog post about how to implement rel=prev/next:
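For anyone who never implemented it, the markup from that 2011 post boiled down to link elements in the head of each page in the series. A minimal sketch, using hypothetical URLs, for page 2 of a three-page article:

```html
<!-- Page 2 of a hypothetical three-page article -->
<head>
  <link rel="prev" href="https://www.example.com/article?page=1">
  <link rel="next" href="https://www.example.com/article?page=3">
</head>
```

(Page 1 of the series carries only rel=next, and the final page only rel=prev.)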
And John Mueller’s follow-up clarification:
So Google now recommends a single page of content and NOT breaking that content into multiple pages, but I think it’s important to take this advice with a grain of salt and make sure that you are doing what’s best for your users.
Google also says that it can figure out multi-page sets of content if you link between the pages in a way that’s clear to users.
While it is frustrating that we’ve been recommending a Google “best practice” that Google itself was not using (and never told us it had stopped using), I still don’t think implementing it was a waste of time.
Here’s my take on the situation:
Bing still uses the signal
It is still used by Bing. See Bing’s response here:
Keep in mind that Bing is used for all voice search that is not Google Home. So if you think your audience is made up of heavy voice search users, you might want to keep rel=prev/next in place to help Bing with crawl budget issues and with understanding your site, so that you can rank more effectively in its featured snippets (and for voice search queries).
It’s still used for accessibility/ADA compliance
rel=prev/next is still an accessibility standard, so if your audience uses screen readers (or you’re a .gov site that needs to be Section 508 compliant), it’s best to keep the markup in place.
Google’s not great at “figuring it out”
The statement from Google was that they were “pretty good at figuring it out,” but I don’t see that being the case on large enterprise sites where you’re tackling crawl issues. I think it’s worth giving Google every signal you can to control your crawl budget, especially if you’re in the news space and speed to indexing is an issue. Whether you use rel=prev/next or not, you need to have clear link signals between the pages.
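By “clear link signals” I mean plain, crawlable anchor tags between the paginated pages rather than JavaScript-only navigation. A minimal sketch, with hypothetical URLs, for page 2 of a series:

```html
<!-- Crawlable pagination links on page 2 of a series; URLs are hypothetical -->
<nav aria-label="Pagination">
  <a href="/news?page=1">Previous</a>
  <a href="/news?page=1">1</a>
  <span>2</span>
  <a href="/news?page=3">3</a>
  <a href="/news?page=3">Next</a>
</nav>
```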
And a full one-page answer is sometimes NOT the best option when it creates a large HTML page that Google is not able to fully render. Most studies show that any page over 2 MB will throw errors; Deepcrawl, for example, sets its “Max HTML Size” threshold at 204,800 bytes – 200 KB, well under that 2 MB ceiling.
This is clearly becoming a lower priority for implementation
That said, this particular recommendation is dropping in priority for me across all of my audits (except .gov) unless we get different guidance from Google, or unless I see an enterprise site with tons of low-quality paginated pages being indexed that are not being controlled via traditional parameter control measures (the URL parameters report inside GSC, robots.txt directives).
You might still need to strengthen internal link signals to aid in URL discovery
This does not mean that removing the pagination markup leaves you free of other discovery issues: if you have high-quality content buried deep in a pagination sequence that you want to perform better, I would recommend looking at a stronger internal link structure, and potentially rel=prev/next, to assist.
Don’t bother taking it down if it’s in place
I certainly would not waste the time taking down any rel=prev/next you currently have in place. In most situations, I would simply make putting it in place a lower priority for most clients.
What do you think? I’d love to hear your thoughts in the comments!
Ane Skovsted says
Hi Katherine,
Thank you so much for a splendid overview of what to do in regards to rel=prev/next. I agree that it is essential not to trust Google’s ability to “figure things out”. And it is certainly important to have strong internal linking signals and not bury important content deep in paginated pages with few internal links. How do you canonicalize on e.g. ecommerce sites with large pagination series, let’s say up to 10 or more versions of a product listing page? Do you have a self-referencing rel=canonical for the first page in the series (page #1), or would that cut off linking signals for #2 etc.? In my head it’s a bit contradictory to both avoid duplicate content and keep a strong PageRank flow down to a product page that you want to rank at the top. You can prioritize and place the top products at the top, but with a huge product catalogue that may not be easy. What am I missing?
Katherine Watier-Ong says
Google just released new e-commerce pagination guidelines, which align with what I’ve been recommending: every page should have a self-referencing canonical tag, and each paginated page should have its own unique URL. A canonical tag does not stop the flow of PageRank; it just helps disambiguate the paginated pages from each other (helping Google recognize that each is its own page that could rank on its own, and not duplicate content). As for which type of pagination generates the most crawling, I always recommend this study, which was covered at Tech SEO Boost a few years ago: https://audisto.com/guides/pagination/
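To make that concrete, here’s a sketch of the head for page 2 of a paginated product listing under that approach (the URLs are just placeholders):

```html
<!-- Page 2 canonicalizes to itself, not to page 1 -->
<head>
  <link rel="canonical" href="https://shop.example.com/widgets?page=2">
  <!-- Optionally keep rel=prev/next for Bing and screen readers -->
  <link rel="prev" href="https://shop.example.com/widgets?page=1">
  <link rel="next" href="https://shop.example.com/widgets?page=3">
</head>
```

The self-referencing canonical tells Google that page 2 is its own indexable URL; canonicalizing every page in the series to page 1 is what risks cutting off the crawl path to products deeper in the sequence.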