Your entire website matters
Today I’m going to talk through all of the ways Google looks at your ENTIRE website when it determines what type of search equity your site can expect. I often work with clients where different teams manage different parts of their website, and this tip is for them.
Hello! Thanks for listening to SEO tips today.
I was inspired to create this tip by a comment from Google’s John Mueller that slow URLs can potentially impact the ranking of other, faster URLs on your site.
If all of your slow URLs are on a single subdomain, Google might just group those URLs together when evaluating speed.
Also, Google has recently announced that noindexed pages can impact Core Web Vitals. Frankly, this was my WTF for the month. If you block a page from indexing, clearly you don’t want it included when Google evaluates your site’s ranking, right? All the more reason not to generate those pages in the first place, or to orphan them when possible, so that hopefully Googlebot won’t find them.
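If you want to audit which of your pages carry a noindex directive (whether in the HTTP headers or the HTML), here is a minimal sketch in Python using the requests library. The URL, the User-Agent string, and the regex-based meta-tag check are illustrative assumptions on my part, not anything Google prescribes:

```python
import re
import requests

def noindex_status(url: str) -> dict:
    """Check whether a URL asks search engines not to index it,
    via the X-Robots-Tag header or a meta robots tag."""
    resp = requests.get(url, timeout=10, headers={"User-Agent": "seo-audit-sketch"})
    # Directive can arrive as an HTTP header...
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    # ...or as a meta tag. Crude regex: assumes name= comes before content=.
    meta_noindex = bool(
        re.search(
            r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
            resp.text,
            re.IGNORECASE,
        )
    )
    return {
        "url": url,
        "status": resp.status_code,
        "header_noindex": header_noindex,
        "meta_noindex": meta_noindex,
    }

if __name__ == "__main__":
    print(noindex_status("https://example.com/some-page"))  # hypothetical URL
```

Run something like this against a sample of your URLs and you’ll know which noindexed pages Googlebot can still reach.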
Google has referenced other ways that parts of your site can impact its overall ranking, such as:
- Server errors, whether found on your main domain or a subdomain. If your subdomains share a server with your primary domain and one of them starts throwing server errors, Googlebot will back off crawling that server so that it doesn’t tank your overall site.
- Doorway pages. If Google sees a high number of orphaned pages, it might conclude that’s what you are doing, which could generate a penalty.
- A high number of broken internal links, as it’s a sign of a low-quality website (see the crawler sketch after this list).
- Large amounts of thin content. The Panda algorithm is domain-wide (Google source).
- Linking to penalized sites. This is the “Bad Neighborhood” algorithm. Matt Cutts has said: “Google trusts sites less when they link to spammy sites or bad neighborhoods.” He has suggested using the rel="nofollow" attribute if you must link to such a site, because “Using no-follow disassociates you with that neighborhood.”
- Crawl traps. If Googlebot gets caught in one part of your site, it often won’t reach all of your site’s pages. I’ve fixed crawl traps on sites and seen ranking lifts across the entire site.
- Soft error pages. Google could treat “soft 404” pages (pages that return a 200 status but show error content) as low-quality pages on your site, lowering your overall domain’s search equity. The sketch below flags likely soft 404s as well.
- Unmaintained subdomains that are on the same server as your primary domain. Poorly maintained websites *could* present a liability if Google sees them as part of the main domain. Here’s the guidance from Google’s Search Quality Rater Guidelines around website maintenance:
- “Some websites are not maintained or cared for at all by their webmaster. These “abandoned” websites will fail to achieve their purpose over time, as content becomes stale or website functionality ceases to work on new browser versions. Unmaintained websites should be rated Lowest if they fail to achieve their purpose due to the lack of maintenance. Unmaintained websites may also become hacked, defaced, or spammed…”
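To hunt for several of these problems yourself, here is a minimal same-site crawler sketch in Python, again using the requests library. Everything in it is an assumption for illustration: the start URL is hypothetical, the soft-404 phrase list is a crude heuristic, and the regex link extraction won’t handle every page, so treat this as a starting point rather than a production auditor:

```python
import re
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests

START = "https://example.com/"   # hypothetical starting URL
MAX_PAGES = 200                  # cap so we don't fall into a crawl trap ourselves
SOFT_404_HINTS = ("page not found", "no longer exists", "nothing was found")

def crawl(start: str):
    """Breadth-first crawl of one host, collecting broken links and likely soft 404s."""
    host = urlparse(start).netloc
    queue, seen = deque([start]), {start}
    broken, soft_404s = [], []
    while queue and len(seen) <= MAX_PAGES:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            broken.append((url, "request failed"))
            continue
        if resp.status_code >= 400:
            broken.append((url, resp.status_code))
            continue
        # A 200 response whose body reads like an error page is a likely soft 404.
        body = resp.text.lower()
        if any(hint in body for hint in SOFT_404_HINTS):
            soft_404s.append(url)
        # Queue same-host links we haven't seen yet (naive href extraction).
        for href in re.findall(r'href=["\'](.*?)["\']', resp.text):
            link = urldefrag(urljoin(url, href)).url
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return broken, soft_404s

if __name__ == "__main__":
    broken, soft = crawl(START)
    print("Broken internal links:", broken)
    print("Possible soft 404s:", soft)
```

The MAX_PAGES cap is the same defensive idea behind fixing crawl traps: an unbounded URL space can swallow a crawler, whether it’s yours or Googlebot.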
This is not an exhaustive list, but it highlights how neglected parts of your website can impact its overall ranking and search equity.
Thanks for listening. Come back tomorrow for another SEO tip.
Listen to the previous episode: New Free Tools inside Bing and Google search accounts
Subscribe and listen on your favorite podcast app: Apple Podcasts | Google Podcasts | Spotify | Spreaker | iHeartRadio | Castbox | Deezer | Podcast Addict | Podchaser