Disclaimer: Some of the links below are affiliate links, meaning, at no additional cost to you, WO Strategies LLC will earn a commission if you click through and make a purchase.
I don’t know about you, but 2018 was quite a year. On both the work and home front, the pace was intense, sleep was limited, and I felt stretched most of the time.
But, I also learned a ton this year – both about running a business and SEO. And I wanted to share those personal “aha” moments with you in case they can help you in your career or business.
Here are the top 11 things I learned about business and SEO in 2018:
1. No matter your industry – even if you’re a solo entrepreneur, you need help. Ask for it.
This was a year when I had to remind myself not only to take time for myself occasionally, but also to ask for help when I need it. Both are critical to maintaining business consistency and growth.
I started with a virtual assistant (VA) when my second daughter was born. I knew that, coming off maternity leave, I would need someone to double-check my writing, if only because of the impending sleep deprivation. That VA came in really handy when the same daughter was hospitalized for several days this year with breathing issues and my busy week had to be completely rescheduled. I could focus on my daughter and her healing, knowing my VA would handle what she could to keep the boat afloat until I could retake the helm. I now have two virtual assistants helping me keep things moving – freeing me up to spend more time writing, speaking and working on client strategy.
2. Be as detailed as possible with your recommendations
When providing technical SEO recommendations (or really any recommendations) to developers, it’s important to spell out exactly what the ideal outcome looks like so that the related QA tests can verify it.
I saw one case this year where the developers took liberties with our recommendation to improve thin content: they deleted indexed pages outright, or added the “noindex” meta tag (<meta name="robots" content="noindex">) to them. These were pages that were already ranking for top terms. Luckily, we caught it before any huge traffic loss, but my future recommendations will spell out exactly which pages are in scope, so that every team involved in making the magic happen is working from the same playbook.
On the flip side, I also developed a video SEO guide this year that I think is much more prescriptive.
3. Robots.txt files need to be at the root level of your website – and cannot redirect
It turns out that if robots.txt files redirect or are located anywhere but at the root level, Google will ignore them. During an audit earlier this year, I saw a client had a robots.txt in a subdirectory. Based on their Google Search Console URL parameter report, Google was hitting all of the parameters and file folders that should have been blocked by their robots.txt.
Here’s a tweet from Gary Illyes confirming that Google ignores a robots.txt that redirects:
that's ignored afaik, it's treated as no robots.txt
— Gary 鯨理/경리 Illyes (@methode) February 16, 2017
In this particular case, the client worked hard to set up their robots.txt to control the Google crawl – ensuring that Google was getting to the good content, ignoring the duplicative content and parameters, and not overloading their server, but all of those commands were ignored because of the redirect.
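If you want to sanity-check your own setup, here is a minimal sketch in TypeScript (assuming Node 18+ for the built-in fetch; the domain is a placeholder) that flags a robots.txt Google would ignore:

    // Minimal sketch: request /robots.txt from the site root without
    // following redirects, then report what Google would likely do.
    // Per Gary's tweet above, a redirecting robots.txt is treated as
    // no robots.txt at all.
    async function checkRobots(origin: string): Promise<void> {
      const res = await fetch(`${origin}/robots.txt`, { redirect: "manual" });
      if (res.status >= 300 && res.status < 400) {
        console.warn(`robots.txt redirects (${res.status}) - treated as no robots.txt`);
      } else if (res.status === 200) {
        console.log("robots.txt answers 200 at the root - directives will be honored");
      } else {
        console.warn(`Status ${res.status} - robots.txt effectively missing`);
      }
    }

    checkRobots("https://www.example.com");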
4. “Old school” HTML markup is important.
When doing some research for a client, I saw a competitor grab the Featured Snippet slot because the “answer” sat under an H2 tag midway down the page. So, the takeaway is to make sure your writers use standard HTML formatting in their online writing.
Here’s an example:
First off, Google knows that “breast cancer causes” is the same as “risk factors.” You get that hint by looking at the first question under the People also ask section.
Secondly, that Featured Snippet is pulled from this page, which is labeled “causes of breast cancer.” That particular snippet is pulled from a paragraph deep in the page (you need to scroll three times to see it).
Google is pulling it as a featured snippet because the header “Known Risk Factors” is an H2 tag.
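In markup terms, the winning pattern is as simple as this (a simplified sketch, not the page’s actual source):

    <h2>Known Risk Factors</h2>
    <p>A short, self-contained answer in the paragraph directly below the
    heading gives Google a clean block to lift into a Featured Snippet.</p>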
5. You can tell Google that your old content hasn’t been updated – and save crawl budget.
This is my favorite tip this year. I picked it up from here:
If you have old content that for various reasons you need to keep online AND need to keep indexed, consider returning a 304 – Not Modified. That HTTP status code can save your crawl budget.
It makes your server tell Googlebot:
“I have your request, but the information you want hasn’t changed since the last time you asked for it. Let’s not waste time; just use the file you downloaded last time.”
More from Google about using 304s here.
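To make that exchange concrete, here is a minimal sketch of a server honoring the If-Modified-Since header, written in TypeScript for Node (the date and page content are made up):

    import { createServer } from "node:http";

    // Hypothetical "last touched" date for an old page you need to keep
    // online and indexed.
    const lastModified = new Date("2015-06-01T00:00:00Z");

    createServer((req, res) => {
      const since = req.headers["if-modified-since"];
      // If Googlebot already has a copy at least this fresh, answer 304
      // with no body, and the crawl moves on without a re-download.
      if (since && new Date(since) >= lastModified) {
        res.writeHead(304);
        res.end();
        return;
      }
      res.writeHead(200, {
        "Content-Type": "text/html",
        "Last-Modified": lastModified.toUTCString(),
      });
      res.end("<p>Old content that still needs to stay indexed.</p>");
    }).listen(8080);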
6. Rankings are different for high-traffic head terms
I am lucky enough to be contracting for some large enterprise sites, and they rank for high-volume keywords against other large, authoritative domains.
I’ve seen this phenomenon live – ranking when you’re at that level is just…different. The old ranking signals are not as important as the ones driven by RankBrain – pogo-sticking, on-page UX, and so on.
This research was a real eye-opener for me when troubleshooting how to improve rankings when you’re playing ball at that level.
7. It is possible to create sub-sub-subdomains that are recognized by Google.
Mind you, I think these make overall site maintenance harder and are tougher for users to remember, but I spotted these examples in the wild this year:
https://response.restoration.noaa.gov and this one: https://www.galvestonlab.sefsc.noaa.gov/
8. Google is working quickly to roll out a search experience that mirrors the user’s journey.
This will heavily influence which result your searcher clicks on the SERPs – for the better, I think. Here are some real examples I spotted recently:
Google testing new SERPs for medical terms? A hint of what the searcher's journey might look like? #SEO. Notice the categorization at the top…you can just tab deeper into the topics you're interested in. pic.twitter.com/BZtHZJv0if
— Katherine Watier Ong (@kwatier) December 2, 2018
If you haven’t mapped your customer journey to your keyword research, or made sure the topics and subtopics on your site match a known ontology for your subject area, then this might be the year to think through that transition. Resolve to put a plan in action if you want to retain your search traffic. Pro tip: if you’re looking for a keyword tool that will export your keywords already grouped into subtopics, check out SEMrush’s Keyword Magic Tool.
9. You can use Cloudflare Workers to implement SEO recommendations and bypass your developers
I am personally fond of developers. My brother is one. But sometimes it’s hard to get your SEO recommendations implemented by the development team. If you can’t get your developers to prioritize your fixes, you might want to check out this presentation from Dan Taylor at SALT Agency, which won him the Technical SEO Research award.
Here’s more about using Cloudflare to manage your redirects, and here’s how to use Cloudflare Workers to manage your hreflang tags if you can’t get your dev team to help you.
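For flavor, here is a minimal sketch of a Cloudflare Worker handling redirects at the edge (a hypothetical redirect map, not Dan’s actual implementation):

    // A tiny redirect map maintained by the SEO team; the paths are made up.
    const redirects: Record<string, string> = {
      "/old-landing-page": "/new-landing-page",
      "/2017-pricing": "/pricing",
    };

    export default {
      async fetch(request: Request): Promise<Response> {
        const url = new URL(request.url);
        const target = redirects[url.pathname];
        if (target) {
          // 301 answered at the edge, before the request reaches the origin.
          return Response.redirect(`${url.origin}${target}`, 301);
        }
        // Everything else passes through to the origin untouched.
        return fetch(request);
      },
    };

The appeal of this approach is that the SEO team can ship a change by editing a lookup table, without touching the origin codebase.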
And, if you have an A+ developer team who tirelessly works behind the scenes to support your SEO plan, make sure to let them know this year how much you appreciate their hard work.
10. Making it easier for Googlebot to discover your URLs can quickly lift traffic.
A picture is worth 1,000 words: by adjusting just two variables (homepage speed and URL discovery), we saw a VERY positive change for a recent client in just three months.
You can read the whole SEO case study here.
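The case study has the specifics, but one of the most common URL-discovery levers is a clean XML sitemap submitted through Search Console; here is a minimal example with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2018-12-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/key-landing-page</loc>
        <lastmod>2018-11-15</lastmod>
      </url>
    </urlset>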
11. Automating your reporting can free up resources to help with your SEO efforts.
I’ll admit that this isn’t a new insight for me this year, but it was really fulfilling to help one of my clients move from manual Excel reporting that wasn’t focused on their SEO goals to an automated solution.
We not only cleaned up their Google Analytics installation (their data was misleading them), but we also helped them clarify their digital marketing goals and put in place a set of Google Data Studio dashboards that freed them up to ANALYZE the data. That allowed them to make strategic digital marketing changes based on accurate data (the best kind!), instead of spending time manually pulling together reports that were riddled with errors.
If you’re interested in clarifying your SEO goals and setting up an automated dashboard, you should check out my posts about how to set up your website goals and how to set up an actionable Google Data Studio dashboard.
So what did you learn about SEO this year? And what’s on your learning list for 2019?
I’m focused on learning more about machine learning and Power BI.
I’d love to hear about what’s on your list. Add them to the comments or drop me a note!