Search and social news and what I learned about SEO last month…
Over the last year I’ve been sharing SEO and social media news with you monthly and I’ve decided to change the format a bit.
In addition to sharing what I believe are the most important news announcements, research findings, and new tactics from the past month (the ones worth bookmarking and reading), I've also decided to share what I've personally learned about digital marketing over that same month.
So here we go:
Ontologies and the Topic Layer
If you’re not familiar, an ontology is a formal representation of domain knowledge as a set of concepts and the relationships between those concepts. In the field of AI, ontologies have been used as artifacts for representing human knowledge, and Google uses them to create and update its Knowledge Graph.
I’ve been working quite a bit with one of my clients to help them think through a digital marketing strategy that will position them to rank well 2-3 years from now. As part of that effort, I’ve been focused on learning more about how ontologies are used in the Knowledge Graph, while digging into how Google might be generating the filter bubbles seen in Knowledge panels as well as in the top navigation in search (the latter has only been seen as a beta test).
Here are my initial thoughts related to how to use ontologies and markup in your SEO strategy:
It seems clear that Google has been using its Knowledge Graph and its ability to crawl ontologies to develop a sense of major topics and subtopics.
Additionally (hat tip to Bill Slawski, who I caught up with last month and who pointed out this patent to me; I then found this patent as well), Google has also been using clickstream data to refine those topics and create new subtopics that might not fit within the existing ontologies.
Marking up the entities in your copy (and aligning them with a known ontology) helps Google understand your content more quickly. For voice search, Google pings the Knowledge Graph, so with schema and entity markup (a quick sketch follows this list) you can enhance your ability to:
- appear in voice search
- appear in advanced Google features
- gain better and broader findability overall
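If you want a concrete starting point, here's a minimal sketch (in Python, just to generate the JSON-LD) of what aligning your copy with a public knowledge base can look like. The page, headline, and entities below are hypothetical examples; the general pattern is schema.org `about`/`mentions` entities with `sameAs` links pointing at an external knowledge base.

```python
import json

# A minimal sketch: a schema.org Article whose "about" and "mentions" entities
# carry sameAs links to a public knowledge base (Wikipedia here). The URL,
# headline, and entity choices are hypothetical examples.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Beginner's Guide to Ontologies in SEO",
    "url": "https://www.example.com/ontologies-in-seo/",
    "about": {
        "@type": "Thing",
        "name": "Ontology (information science)",
        "sameAs": "https://en.wikipedia.org/wiki/Ontology_(information_science)"
    },
    "mentions": [
        {
            "@type": "Thing",
            "name": "Knowledge Graph",
            "sameAs": "https://en.wikipedia.org/wiki/Google_Knowledge_Graph"
        }
    ]
}

# Emit the JSON-LD script tag you would place in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(article_markup, indent=2))
print("</script>")
```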
Additionally, to make sure your content fully matches the new subtopics Google is creating from clickstream data, do user journey and keyword research to uncover topics that fit your existing content but that you haven't covered yet.
Ahrefs Tip:
I have been using Ahrefs to conduct competitive analysis over large datasets, and I have discovered that the monthly subscription-based SEO tools become really challenging to use for competitive analysis on large sites. All of the monthly subscription tools limit how much ranking data you can track AND how much data you can export from the system, so I had to figure out creative ways to use the tools.
One of the things I needed for this competitive analysis was the number of URLs each competitor has ranking for the target keyword, which gives a sense of how much effort the competitor has put into creating content for the topic.
Here’s the process I figured out/followed:
- Put the competitor's main domain into Ahrefs and go to the “Organic keywords” report.
- Filter the report to include only keywords that contain the main keyword.
- Pull only positions 1-11 (roughly Google page one; if the SERP has a Knowledge panel, it's tracked as #11).
- Export the sheet, dedupe by URL, and then count the rows (see the pandas sketch below). Voilà!
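If your export is large, a few lines of pandas can do the filtering, deduping, and counting for you. This is just a sketch: the column names ("Keyword", "Position", "URL") and the example keyword are assumptions, so adjust them to match whatever your Ahrefs CSV actually uses.

```python
import pandas as pd

# Load the Ahrefs "Organic keywords" export. Column names vary by export,
# so treat "Keyword", "Position", and "URL" as assumptions to adjust.
df = pd.read_csv("competitor-organic-keywords.csv")

main_keyword = "standing desk"  # hypothetical target topic

# Keep keywords containing the main keyword that rank in positions 1-11
# (roughly Google page one; a Knowledge panel is tracked as #11).
mask = (
    df["Keyword"].str.contains(main_keyword, case=False, na=False)
    & df["Position"].between(1, 11)
)
page_one = df[mask]

# Dedupe by ranking URL and count the rows: that count is the number of URLs
# the competitor has ranking for the topic.
ranking_urls = page_one.drop_duplicates(subset="URL")
print(f"{len(ranking_urls)} URLs ranking for '{main_keyword}'")
```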
In addition to the SEO items that I personally picked up this month, here are the other research studies, Google announcements, and tips that I think are worth sharing:
The Search and Social Market
SparkToro has released some great new data that you should take a look at – especially if you have a brand with an online European footprint. First, look at the dominance of Google-owned properties in the US vs. the EU:
96% of all searches in the EU+UK are Google-owned searches.
Now look at the number of no-click searches in the US vs EU+UK:
The report even provides full Google CTR data by country that you can download and use.
The big takeaway is that Google is dominant, and yet organic search is slowly being cannibalized around the world as Google rolls out more instant answers, no-click Knowledge panels, and paid ad slots in the SERPs.
It's not yet known whether the decline in clicks is the result of voice search usage (and the instant answers that support it), but that's one theory.
Buffer State of Social 2019 Report
Here are some relevant stats from Buffer’s State of Social 2019 report:
- 14.5% of businesses don't publish ANY video content. Consider this: YouTube is the second most popular search engine, 92% of the videos that appear in Google search are YouTube videos, and more people in the 18-34 age group watch videos on their devices than on TV. So, NOT creating video content seems like a missed opportunity. And yet, 14.5% of businesses are missing that opportunity.
- Related: It seems that more businesses post on Twitter and Instagram than on YouTube, which is great, but given the stats above, video is where search is going, and not having videos is a huge missed opportunity.
- 81.2% of the survey respondents publish on Facebook first. This is also interesting considering the impact YouTube has on SEO (see my thoughts below about reputation as a ranking factor).
- Did you know that videos on LinkedIn get shared 20x more than other content formats? This provides more evidence that video is an incredibly effective tactic for your overall marketing plan.
- 50.9% of survey respondents don’t have a documented social media strategy. A unified brand voice is so important (more on that under “Reputation as a Ranking Factor” below) and should be prioritized. According to this survey, slightly more than half do not have this plan in place.
- One stat that is omitted is how many respondents use a social media monitoring tool (in addition to management tools). My guess, based on my experience working with big brands, is not many. Unfortunately, most brands don't know how invaluable monitoring tools can be to a social media marketing strategy – especially when it comes to managing your social media reputation online.
Insights into the Google Algorithm
Google January Update – “Newsgate”
There was a minor Google algorithm update in January, and it seems to have targeted news and blog sites that rewrote or scraped content from other sites, especially those that rewrote the content without adding additional value. You can read more about the chatter here.
Reputation as a Ranking Factor?
I think this Search Engine Land article about how Google could create a model that uses social media signals (and other signals) to identify “high reputation” websites and then use that as a ranking factor is a must-read. I've been saying for years that social media signals, while not a direct ranking signal, correlate with higher search engine rankings. We've always been able to point to concepts like brand recognition, personalized user searches, and word of mouth as ways that social media can impact SEO, but this takes it to a whole new level.
The correlation is why I still help some clients with social media – developing a consistent brand voice, aiding their social media presence, and setting up social media promotion and customer service teams.
Your brand experience on social media needs to match your other marketing channels, needs to be consistent and needs to be coordinated. Most brands are not as coordinated as you would expect, and most are not leveraging social media promotion (and the reputation that could be built using social media) as much as they could.
Biases in Machine Learning
A must-listen TED Talk about how to fight algorithmic bias in machine learning:
This is especially worth a watch for anyone in search marketing. As Britney Muller of Moz has mentioned, the sample set used in machine learning really impacts the results of the algorithm.
In related news, there's this Wired article about image-recognition machine learning software that was taught a sexist view of women. In fact, the models not only mirrored sexist biases but also amplified them.
As sophisticated machine-learning programs proliferate, such distortions matter, and it's even more important to have government-led ethics oversight of big technology companies. Brookings has presented some recommendations here. I personally think our best bet might be to support the ACLU's work on the issue.
Google announcements
Google’s New CMS for News Sites
Google is doubling down on its support of WordPress and has announced Newspack, a news-friendly CMS built on WordPress and designed for local news sites.
Though the announcement mentions that the program will favor local news sites already on WordPress, if you're a local news site with a less-than-ideal WordPress installation, I'd still recommend applying.
Google’s URL Inspection Tool Shows More Data
Now you can get the HTTP response, see what your site looks like on mobile, see the JavaScript logs, and more. Here's the tweet announcing the new features:
Yay! A new feature is now available in Search Console!
You can now see the HTTP response, page resources, JS logs and a rendered screenshot for a crawled page right from within the Inspect URL tool! Go check it out and let us know what you think! pic.twitter.com/qihtueIbsF
— Google Webmasters (@googlewmc) January 16, 2019
Disavow Tool
In a recent Google Webmaster Hangout, John Mueller mentioned that one reason to submit a disavow file is to prevent a manual action (on bad links) in the future, should you get a manual review.
When asked if webmasters should disavow unnatural links that were made years ago, he said that if it was clear the site was no longer making unnatural links, then this would be unlikely to cause a manual action.
When asked if those links can harm the site algorithmically if they don’t get a manual action, he said:
“That can definitely be the case. It’s something where, our algorithms, when they look at it, if they say, ‘Oh, there are a bunch of really bad links here,’ then, maybe they’ll be a little bit more cautious in regards to links in general for the website. So if you clean that up, then the algorithms look at it and say, ‘Oh, that’s ok. It’s not bad.’”
So most sites do not need to use the disavow tool, but if you have a set of spammy links in your past and are worried about an algorithmic penalty (or you've received a manual penalty notice), then you should disavow.
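For reference, the disavow file itself is just a plain text file: "#" lines are comments, "domain:" entries disavow an entire domain, and full URLs disavow individual pages. Here's a small sketch that writes one; the domains and URL listed are purely hypothetical examples.

```python
from datetime import date

# Purely hypothetical examples of link sources you want Google to ignore.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["https://low-quality-blog.example/old-guest-post/"]

# Build the disavow file: "#" for comments, "domain:" for whole domains,
# full URLs for individual pages.
lines = [f"# Disavow file generated {date.today().isoformat()}"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```

You then upload the file through the Disavow Links tool in Search Console for the property in question.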
New Tutorial on How to Use Chrome Web Dev Tools for Page Speed Improvements
Google has released a great set of videos and tutorials that walk you through how to use Chrome Web Dev Tools to discover page speed issues with your URLs and create recommendations for how to improve their performance. While written for web developers, it's a good set of tutorials for SEOs who conduct technical audits or work on fixing technical SEO issues.
“Leaked” UX Playbooks from Google
It turns out these were given out during SMX East in October, but I'm just now seeing them. They're super useful additional data to share with clients if you're trying to get them to modify their site for better SEO and conversion performance.
Here’s the full set of UX guides from Google:
- Finance: http://services.google.com/fh/files/events/pdf_finance_ux_playbook.pdf
- Travel: http://services.google.com/fh/files/events/pdf_travel_ux_playbook.pdf
- Real Estate: http://services.google.com/fh/files/events/pdf_realestate_ux_playbook.pdf
- News & Content: http://services.google.com/fh/files/events/pdf_news_ux_playbook.pdf
- Health Care: http://services.google.com/fh/files/events/pdf_auto_healthcare_playbook.pdf
- Lead Generation: http://services.google.com/fh/files/events/pdf_leadgen_ux_playbook.pdf
- Automotive: http://services.google.com/fh/files/events/pdf_auto_ux_playbook.pdf
Brew yourself a cup of coffee and dig in!
Indexing APIs from Google and Bing
Until now there hasn't been a direct way to speed up the search engines' indexing of your pages, but both search engines now offer APIs that nudge them to crawl your new or updated pages faster (and hopefully index them faster). You can also use the APIs to tell the search engine to de-index pages you don't want in the search index.
While Google announced their API in December and noted that it was only for livestream and job postings, a few SEOs have tried other content formats and have seen success. Currently, the API limits are 600 requests per minute and 200 publishing requests per day.
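If you want to try it, here's a rough sketch of calling the Google Indexing API with a service account (using the google-auth library). The key file path and the URL being submitted are placeholders, and the service account has to be added as a verified owner of the property in Search Console first.

```python
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# The JSON key path is a placeholder; the service account must be a verified
# owner of the property in Search Console.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
session = AuthorizedSession(credentials)

# "URL_UPDATED" asks Google to recrawl the page; "URL_DELETED" asks for removal.
payload = {
    "url": "https://www.example.com/jobs/senior-seo-analyst/",  # placeholder URL
    "type": "URL_UPDATED",
}
response = session.post(ENDPOINT, json=payload)
print(response.status_code, response.json())
```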
Bing also announced during SMX West that they have an indexing API as well. Here’s a tweet covering the announcement:
SEOs can now submit URLs to Bing for faster crawling. See bitly link cc @rustybrick #smx pic.twitter.com/29t9ppp6FZ
— Ginny Marvin (@GinnyMarvin) January 31, 2019
You can get to Bing's API information here, and you can read the announcement here.
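For Bing, my understanding from the Webmaster API documentation is that URL submission looks roughly like the sketch below; double-check the exact endpoint and parameter names against Bing's docs before relying on it. The API key and URLs are placeholders.

```python
import requests

API_KEY = "YOUR_BING_WEBMASTER_API_KEY"  # placeholder
ENDPOINT = f"https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl?apikey={API_KEY}"

# Both values are placeholders; siteUrl is the verified site, url is the page
# you want crawled.
payload = {
    "siteUrl": "https://www.example.com",
    "url": "https://www.example.com/new-article/",
}
response = requests.post(ENDPOINT, json=payload)
print(response.status_code, response.text)
```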
Technical SEO
FREE: Weekly Monitoring of Page Speed for Select URLs
Do you need to keep an eye on the page speed of a set of URLs weekly? There's a free Google app (via Google Sheets) that can help you with that.
Often my clients “fix” their page speed once for their top pages, but depending on your developers' activity or your business model, you might need to monitor those pages to make sure they don't become slow over time. This free Google Sheet helps you monitor whichever top URLs make sense for you and works well if you have a short list of URLs to watch.
There are also monitoring tools that will do this for you.
Alternatively, you can use a custom extraction in your favorite crawl tool.
Deepcrawl has instructions, and here are ScreamingFrog’s instructions for setting up their crawler to capture page speed metrics.
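If you'd rather script it than manage a sheet or a crawler, here's a sketch that polls the PageSpeed Insights v5 API for a short list of URLs and prints the Lighthouse performance score; run it weekly via cron or a scheduled task. The URLs and API key are placeholders, and the response fields used are worth verifying against the current PSI documentation.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
API_KEY = "YOUR_API_KEY"  # placeholder; recommended for regular runs

# Placeholder list of the top URLs you want to watch each week.
urls_to_watch = [
    "https://www.example.com/",
    "https://www.example.com/pricing/",
]

for url in urls_to_watch:
    params = {"url": url, "strategy": "mobile", "key": API_KEY}
    data = requests.get(PSI_ENDPOINT, params=params).json()
    # Lighthouse reports the performance category score as a 0-1 value.
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{url}: mobile performance score {score * 100:.0f}/100")
```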
AMA with Gary Illyes on r/TechSEO
Check it out if you’re free on Friday, February 8th at 1 PM EST. www.reddit.com/r/TechSEO.
Tools and Resources
NPR’s Storytelling and Content Training Center (free)
Where has this been all of my life? This is hands down the most comprehensive resource to help you create better stories with your voice, video, images, or writing. It's a must-bookmark resource if you're creating content for your own business or for clients.
What is an “Impression” in Google Search Console?
While it seems straightforward, it's 100% not. If the only thing you're using for rank reporting is Google Search Console (which is not what I recommend; I have my clients track their rankings with an SEO ranking tool), then you really need to read this Search Engine Land article to get the 411 on how the various metrics in Google Search Console are calculated.
Fair warning – you need to be fully awake and caffeinated before tackling this article. It truly isn’t as straightforward as you would think.
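If you want to see the raw numbers behind the UI while you read, here's a sketch of pulling impression, click, and average position rows from the Search Console Search Analytics API with a service account that has read access to the property. The property URL, date range, and key file path are placeholders.

```python
from urllib.parse import quote

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE = "https://www.example.com/"  # placeholder property URL

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
session = AuthorizedSession(credentials)

endpoint = (
    "https://www.googleapis.com/webmasters/v3/sites/"
    f"{quote(SITE, safe='')}/searchAnalytics/query"
)
body = {
    "startDate": "2019-01-01",  # placeholder date range
    "endDate": "2019-01-31",
    "dimensions": ["query", "page"],
    "rowLimit": 25,
}
rows = session.post(endpoint, json=body).json().get("rows", [])
for row in rows:
    query, page = row["keys"]
    print(query, page, row["impressions"], row["clicks"], round(row["position"], 1))
```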
Seen in the Wild: A Tweet Picked Up as a Featured Snippet
Last month we saw PDFs (including some of my clients' PDFs) rank as Featured Snippets. This month, tweets are getting the same treatment!
First time I've ever seen a TWEET grab a Featured Snippet.
Hate saying the name 'Kaggle', I'm from MN and naturally want to say it with a long a so bad…shut up.
Fun fact: Men have kegels too! pic.twitter.com/Pptlhf9DcH
— Britney Muller (@BritneyMuller) January 24, 2019
And, Just for Laughs:
Zendesk generated $430m in 2017. They also created a fake band to rank for people searching for 'Zendesk alternative'. https://t.co/B5FltI4Vs3 pic.twitter.com/02KogatsCx
— Glen Allsopp (@ViperChill) January 27, 2019
That’s it from us for the month. I hope you all have a great next month.
More to come from us about “What We Love About SEO” next month – in time for Valentine’s Day!