Did you see the Google announcement that they no longer use rel=prev and rel=next?
The SEO community is in an uproar: Google has not used this signal for YEARS and never told us.
And here’s a screenshot from their 2011 blog post about how to implement rel=prev/next:
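For context, the markup that 2011 guidance described was a pair of link tags in each page’s head, pointing at the neighboring pages in the series. Here’s a minimal sketch of how a template might generate them; the URL pattern (example.com/articles?page=N) is purely hypothetical:

```python
def pagination_link_tags(base_url: str, page: int, last_page: int) -> list[str]:
    """Build the rel=prev/next <link> tags for one page in a paginated series."""
    tags = []
    if page > 1:  # every page except the first points back
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < last_page:  # every page except the last points forward
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

# Page 2 of a 5-page series gets both tags:
print(pagination_link_tags("https://example.com/articles", 2, 5))
```

The first page in a series emits only rel=next and the last only rel=prev, which is what made the start and end of a set unambiguous to crawlers.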
So Google now recommends a single page of content rather than breaking that content into multiple pages, but I think it’s important to take this advice with a grain of salt and make sure you’re doing what’s best for your users.
Google also says that it can figure out multi-page sets of content if you link between them in a way that’s clear to users.
While it is frustrating that we’ve been recommending a Google “best practice” that Google itself stopped using without telling anyone, I still don’t think implementing it was a waste of your time.
Here’s my take on the situation:
Bing still uses the signal
Keep in mind that Bing powers all voice search that is not Google Home. So if you think your audience is comprised of searchers who are heavy users of voice search, you might want to keep the rel=prev/next to help Bing with crawl budget issues and with understanding your site, so that you can rank more effectively in its featured snippets (and voice search queries).
It’s still used for accessibility/ADA compliance
The standard is still an accessibility standard, so if your audience uses screen readers (or you’re a .gov site and need to be 508 compliant), it is best to keep the markup in place.
Google’s not great at “figuring it out”
The statement from Google was that they were “pretty good at figuring it out,” but I don’t see that being the case on large enterprise sites where you’re tackling crawl issues. I think it’s worth giving Google every signal you can to control your crawl budget, especially if you’re in the news space and speed to indexing is an issue. Whether you use rel=prev/next or not, you need clear link signals between the pages.
And a full one-page answer sometimes is NOT the best option when it creates a large HTML page that Google is not able to fully render. Most studies show that any page over 2 MB will throw errors, and Deepcrawl sets its “Max HTML size” threshold at 204,800 bytes – about 200 KB, well under that 2 MB ceiling.
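If you’re consolidating paginated content into one page, it’s worth sanity-checking the resulting HTML weight. A quick sketch, using Deepcrawl’s default 204,800-byte limit as the threshold (the sample page below is fabricated padding for illustration):

```python
MAX_HTML_BYTES = 204_800  # Deepcrawl's default "Max HTML size" (200 KB)

def html_size_report(html: str) -> tuple[int, bool]:
    """Return the encoded size in bytes and whether it exceeds the limit."""
    size = len(html.encode("utf-8"))  # measure bytes on the wire, not characters
    return size, size > MAX_HTML_BYTES

# A dummy page padded well past the limit:
page = "<html><body>" + "x" * 300_000 + "</body></html>"
size, too_big = html_size_report(page)
print(size, too_big)  # 300026 True
```

Run something like this against your consolidated templates before committing to a single-page approach; if the merged page blows past the threshold, pagination with strong internal links may serve you better.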
This is clearly becoming a lower priority for implementation
That said, this particular recommendation is dropping in priority for me across all of my audits (except .gov sites) unless we get different guidance from Google – or unless I see an enterprise site having tons of issues with low-quality pagination being indexed that is not being controlled via traditional parameter control measures (the URL parameter report inside GSC, robots.txt directives).
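As a sketch of what I mean by robots.txt directives for parameter control, something like the following keeps crawlers out of low-quality parameterized pagination variants. The `sort` parameter here is hypothetical – use whatever parameters actually generate duplicate or thin pagination on your site, and be careful not to block pages you want indexed:

```
# Hypothetical example: block crawling of sorted pagination variants
User-agent: *
Disallow: /*?sort=
```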
You might still need to strengthen internal link signals to aid in URL discovery
This does not mean that removing the pagination structure leaves you free of discovery issues. If you have high-quality content buried deep in a pagination sequence that you want to perform better, I would recommend improving your internal link structure, and potentially using rel=prev/next to assist.
Don’t bother taking it down if it’s in place
I certainly would not waste time taking down any rel=prev/next you currently have in place; in most situations I would just make implementing it a lower priority for most clients.
What do you think? I’d love to hear your thoughts in the comments!