Are you checking the 200 status URLs in your log files?
Hello, thanks for listening to SEO tips today.
This tip comes from an issue I’ve now seen in the log files of two different clients.
If you’re working on a large site, maintaining your crawl budget and ensuring that the bots do not spend time in crawl traps or on low-quality pages is really important.
For one of those sites, I got the development team to implement my recommendations and eliminate a crawl trap: parameter URLs were being generated with a 200 status because of a relative link error, and those error URLs did not return a 404 page with a 404 status. Once the fix was in place and the bots were no longer stuck in that trap, the site gained 1,000 top Google results within a few weeks.
So back to the log files – look at the 200 status URLs, and if any of them look bizarre, open them in your browser and check that they really should have a 200 status. I’ve seen what should have been 404 pages and 500 pages loading with a 200 status, and these were the top URLs with the most bot hits in the log report.
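If you want to surface those URLs from a raw log before eyeballing them, a small script can do the first pass. This is a minimal sketch, not the exact process from the episode: it assumes a Common Log Format access log, and the sample lines, regex, and function name are mine for illustration.

```python
import re
from collections import Counter

# Matches the request and status code in a Common Log Format line (assumption).
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def top_200_urls(log_lines, n=10):
    """Tally bot hits per URL that returned a 200 status, most-hit first,
    so the bizarre-looking ones can be rechecked in a browser."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "200":
            hits[m.group("url")] += 1
    return hits.most_common(n)

# Hypothetical sample lines; note the suspicious relative-link parameter URL.
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /products?ref=..%2F HTTP/1.1" 200 512',
    '66.249.66.1 - - [10/Oct/2023:13:55:37 +0000] "GET /about HTTP/1.1" 200 1024',
    '66.249.66.1 - - [10/Oct/2023:13:55:38 +0000] "GET /old-page HTTP/1.1" 404 0',
    '66.249.66.1 - - [10/Oct/2023:13:55:39 +0000] "GET /products?ref=..%2F HTTP/1.1" 200 512',
]

for url, count in top_200_urls(sample):
    print(count, url)
```

Anything odd near the top of that list is a candidate to open in your browser and verify by hand.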
So that’s your tip. Check your 200 status URLs in your log files and make sure that they should be a 200 status.
Thanks for listening. Come back tomorrow for another SEO tip.