Search News You Can Use Episode 267 - The latest on the Helpful content and link spam updates, info on my new tool and more (December 30, 2022)

Remember when I said there may not be a newsletter this week because of the holiday? Haha…just kidding.

This week I have been enjoying a quiet Christmas with my family. I wasn’t sure whether I was going to take the week off. On Tuesday I thought I’d sit down to play with using ChatGPT as an assistant to help me try once again to create a tool to help people assess traffic drops. I’ve been trying to do this for 10 years, but have never come close to the dream that is in my head. 

Well, it’s working! I’ll share more on my progress in this issue. 

The latest on the helpful content and link spam updates

Both of these updates are still rolling out. 

John Mueller said we likely will not see an update for a “bit longer” because Google tries to reduce the amount of change in their systems for safety reasons.

I have heard many people speculating that the link spam update has caused them significant losses in rankings and traffic. I have not yet investigated myself. I want to wait until the updates finish rolling out.

As I analyze sites that were hit, I’ll be running them through my new tool. If your traffic is down and you can’t figure out why, you can submit your site here. If I use it in my study, I’ll send you a copy of your report or a link to use my tool when it is ready.

I’ll update more in future episodes. Analyzing these updates is going to be very interesting!

Here are some recent community tweets:

It’s worth keeping in mind that if your traffic and rankings are down this week, that could just be normal seasonal change!

I updated my article on the December Helpful Content Update with everything we know so far and more of my thoughts. It’s worth a re-read even if you were not hit.

Q&A: Is engagement a ranking factor?

I love discussions like this. 

Gael published a blog post that he says was “comically unoptimized” and it quickly started ranking at #2. Not bad!

He is wondering whether user engagement could have improved these rankings. Gael had sent this article out to his email list and many had legitimately engaged with it. Does Google recognize and reward that via Chrome? Or, he wondered if perhaps this post might have ranked due to Google rewarding freshness. 

Here’s what Google says about their freshness ranking system:

Google's freshness systems are designed to show fresher content for queries where it would be expected.

It is possible that freshness matters here, although given this is a relatively new product that is being reviewed, I would expect that all articles would be considered fresh enough to rank. I could be wrong though. Freshness is not a system I’ve studied much.

The next question is whether Google uses data from Chrome to determine which sites people engage with. I believe they do, in a few ways.

Immediate user engagement

I’ve mentioned Rand Fishkin’s MozCon experiment before, where the whole audience searched for, clicked on, and engaged with one particular website, and sure enough, that website improved in rankings for that search.

Also, when MHC won the Wix SEO competition a few years ago, we believe part of the reason was because we manipulated user behavior! We asked our newsletter subscribers and my Twitter followers to search for “Wix SEO” (the term we were trying to rank for), and click around our site to find which pages we had hidden pictures of John Mueller on! It was a tight race, but the following day when the contest was judged, our rankings improved just enough for us to win!

Click behavior does seem to affect rankings. However, in both of the above cases, the ranking improvements reverted within a day or two.

If Gael’s good rankings are due to increased engagement, the increase may be short lived. 

How Google may use Chrome engagement signals

I suspect Google likely does gather data from Chrome for some websites. Which content is engaging people? What is getting shared? What are people clicking on? These are all metrics that could correlate with site quality or helpfulness. The problem though is that it’s a noisy metric. If Google relied heavily on clicks and engagement as a ranking factor, people would set up click farms and fake engagement bots. People have tried it with minimal success from what I’ve heard.

What I believe Google does is use a subset of this data for training purposes. We’ve talked in the past about how machine learning algorithms work by labelling examples of good and bad pages. They can then determine what types of things are consistently correlated with high or low quality content and build algorithms to reward those things across all similar searches.

For example, perhaps for certain queries, content that has unique images consistently gets better engagement. If that’s the case, then for future searches of that query or ones with similar intent, Google may make unique imagery a more heavily weighted ranking factor.
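To make the labelling idea above concrete, here is a toy sketch in Python. It measures how strongly a single hypothetical feature (“has unique images”) correlates with a helpful/not-helpful label across a handful of made-up pages. Everything here, the data, the feature name, and the lift metric, is invented for illustration; it is not how Google actually does this.

```python
# Toy sketch of the labelling idea: given pages labelled helpful or not,
# measure how strongly one feature ("has unique images") correlates with
# the label. All data and names are made up for illustration.

def feature_lift(pages, feature):
    """Ratio of P(helpful | feature present) to P(helpful) overall.

    A lift above 1.0 means the feature is more common among helpful pages.
    """
    helpful = [p for p in pages if p["helpful"]]
    with_feature = [p for p in pages if p[feature]]
    helpful_with_feature = [p for p in with_feature if p["helpful"]]
    p_helpful = len(helpful) / len(pages)
    p_helpful_given_feature = len(helpful_with_feature) / len(with_feature)
    return p_helpful_given_feature / p_helpful

pages = [
    {"has_unique_images": True,  "helpful": True},
    {"has_unique_images": True,  "helpful": True},
    {"has_unique_images": True,  "helpful": False},
    {"has_unique_images": False, "helpful": False},
    {"has_unique_images": False, "helpful": False},
    {"has_unique_images": False, "helpful": True},
]

print(round(feature_lift(pages, "has_unique_images"), 2))
```

In this toy data, pages with unique images are helpful more often than average, so the lift comes out above 1.0. A real system would weigh thousands of features like this against engagement data rather than one.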

So why is this page ranking?

Personally, I think Gael’s page is ranking well because it’s good and helpful! This is a product review article and it ticks off a good number of the boxes in Google’s criteria for good product reviews.

It:

  • evaluates the product from a user’s perspective
  • demonstrates knowledge and experience
  • explains what sets the product apart from competitors
  • discusses benefits and drawbacks based on research

That research one is important. Google says they want to reward content that provides original information, reporting, research or analysis. This content delivers.

On top of that, a quick look at the About this result information for Gael’s website shows me he’s built up some reputation (i.e. E-E-A-T signals) in the industry.

This appears to be content that has E-E-A-T and has several aspects that are consistent with what Google says they want to reward.

I wish I could spend more time analyzing this but this newsletter has got to get out! (I’ve just spent two hours fighting with WordPress. Gah!)

Discussed in this episode (for paid members):

  • Thoughts on optimizing for neural matching 
  • Discussion on Jeannie Hill’s article on the latest QRG changes, especially experience
  • Explore is seen more often now and is likely stealing clicks from you
  • The latest important information to know on AI and ChatGPT
  • A whole bunch of other information I found interesting and helpful this week on passage ranking, a recap of this year’s Google updates and more discussion on whether links have less impact for helping sites rank than in the past.

Things I found interesting this week

Thoughts on optimizing for neural matching.

This was a very interesting conversation with Kevin Indig.

He shares how if someone searches for “performance marketing”, Google knows which entities are related to that topic, like for example, “advertising”. 

Next, Kevin demonstrates how he uses Google’s NLP (Natural Language Processing) tool to assess content and determine which entities are discussed in the content. You can then compare the salience score of your content for important entities to your competitors. 

It makes sense to me that content that can be understood by NLP is more likely to rank well. 

You can play with Google’s NLP tool by inserting your own content into it. I’ve played around with using this for years and have felt there are things we can learn from looking at salience scores. But I haven’t thoroughly tested it. With that said, I will be spending more time in the future playing with this.

Kevin says that this produces similar results to many of the SEO content writing tools that often rely on tf-idf. He does not feel that Google uses tf-idf currently, but he says that optimizing content in this way tends to work well in Google’s NLP driven algorithms.
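For anyone curious what tf-idf actually computes, here is a minimal sketch in plain Python. It scores how prominent a term is in one document relative to a small corpus, which is roughly what the content tools Kevin mentions do. The corpus and term are invented for illustration; this is not Google’s method, and Google’s NLP salience scores are computed very differently.

```python
# Minimal tf-idf sketch: how prominent is a term in one document,
# relative to a corpus? Corpus and term are illustrative only.
import math
from collections import Counter

def tf_idf(term, doc, corpus):
    """Term frequency in doc, weighted down if the term is common corpus-wide."""
    words = doc.lower().split()
    tf = Counter(words)[term] / len(words)
    docs_with_term = sum(1 for d in corpus if term in d.lower().split())
    idf = math.log(len(corpus) / (1 + docs_with_term))  # smoothed idf
    return tf * idf

corpus = [
    "performance marketing relies on paid advertising channels",
    "content marketing builds organic traffic over time",
    "email outreach is a classic link building tactic",
]
my_page = corpus[0]
print(round(tf_idf("advertising", my_page, corpus), 3))
```

A term scores high only if it is frequent in your page and rare in the rest of the corpus, which is why tf-idf tools nudge you toward topic-specific vocabulary rather than generic filler.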

I do have to differ with Kevin on one particular point. He says that Google did not once mention E-A-T in their new search systems guide. However, the helpful content system documentation links to information on creating content that the helpful content system is designed to find helpful. It gives us questions to use to self-assess our site, such as “Is this content written by an expert or enthusiast who demonstrably knows the topic well?” or “Does your content clearly demonstrate first-hand expertise and a depth of knowledge (for example, expertise that comes from having actually used a product or service, or visiting a place)?”, questions clearly related to E-A-T. The document ends by telling us how important E-A-T is.

E-A-T is important to Google

I won’t reopen the dated “Is E-A-T a ranking factor?” argument again. Is it a line in Google’s algorithms? No. It is not a single ranking factor, but it is most definitely considered in every single query searched on Google.

Jeannie Hill’s thoughts on the latest QRG changes

I cannot believe I have not yet fully reviewed the changes to the QRG. Thankfully Jeannie has made a great summary for us!

Quality Raters Guidelines Focus: Content Writers Expertise

My favourite parts:

  • “The addition of “experience” underscores that content quality should also assess the extent to which the content creator has first-hand experience to support the article’s topic.”
  • Jeannie talks about how author related mentions like “content creator” or “journalist” or “author” are all over the QRG. I’m with her in thinking that Google wants to recognize people as entities whether they are well known authors or simply hobbyists sharing their craft.
  • The QRG says 6 times how important it is for your content to perform well on mobile.

I’d recommend giving this article a read!

Interesting new SERP features or tests

Is Explore stealing clicks from you?

I am seeing more and more Explore results in my Google searches. It used to be that I could occasionally trigger Explore for some searches by scrolling for several pages. This week, I am often seeing Explore take up the first several scrolls.

“Things to know” in the right sidebar

This is apparently not new (thanks Barry), but I have not seen it before. Nick caught a glimpse of a search result where a Things to Know feature was present in the right sidebar.

It would not surprise me if we see more and more things appear in the sidebar. Google will likely train us to use AI powered search features alongside traditional search.

My new traffic drop assessment tool is coming!

I’ve got a new tool coming to help you assess traffic drops!

I have been trying for ten years now to build a tool to automate my process in assessing traffic drops. It turns out it’s not that easy to do! Or perhaps I just haven’t talked to the right people. 

I’ve hired and had conversations with so many developers and SEOs to discuss this, but it’s always been an expensive side project that never got near completion or close to being as accurate as I wanted. Every couple of years I pick up the project again and give it another go.

This week I spent a day using ChatGPT as my assistant to help me once again try and build the tool. And um…I got it to work! ChatGPT wrote some scripts for me. Some did not work, some took some tweaking, but ultimately my tool started to do what I wanted it to do. After an entire day of tweaking formulas and trying different scripts and calculations I was near tears when I realized that the tool was accurately picking up traffic drops that I would have diagnosed as Google update hits and was generating my recommended list of resources for recovery.
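To give a feel for the kind of script involved (my actual tool is much more involved than this), here is a simplified sketch: flag a possible update hit by comparing average daily traffic in a window before and after a known update date. The update date, threshold, and traffic numbers are all illustrative assumptions, not my real formulas.

```python
# Simplified sketch: compare average daily traffic before vs. after a
# Google update date. Dates, numbers, and threshold are illustrative.
from datetime import date, timedelta

def drop_ratio(daily_traffic, update_date, window_days=7):
    """daily_traffic: dict mapping date -> sessions.

    Returns average traffic in the window after the update divided by the
    average in the window before it. Below 1.0 means traffic dropped.
    """
    before = [daily_traffic[update_date - timedelta(days=i)]
              for i in range(1, window_days + 1)]
    after = [daily_traffic[update_date + timedelta(days=i)]
             for i in range(1, window_days + 1)]
    return (sum(after) / len(after)) / (sum(before) / len(before))

update = date(2022, 12, 5)  # illustrative update start date
# Fake data: steady 1000 sessions/day, dropping to 600 after the update.
traffic = {update + timedelta(days=d): (1000 if d <= 0 else 600)
           for d in range(-7, 8)}

ratio = drop_ratio(traffic, update)
if ratio < 0.8:  # arbitrary threshold for this sketch
    print(f"Possible update hit: traffic at {ratio:.0%} of pre-update level")
```

The real work, of course, is in separating update hits from seasonality and other noise, which is where all my tweaking of formulas went.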

I’m currently testing it on sites that have been submitted to me that have seen declines with recent updates.

I am not sure how long it will take me to complete this tool. There is more to be done and I’m so excited to do it! When it is ready for the public, I will likely open it up to paid newsletter subscribers to beta test it. And then eventually I plan to make it free for everyone to use.

And wait till you see what phase two of the tool will be!!! It is very exciting!

Podcast / YouTube

Normally this would be a podcast week. I’ve been so busy with the holiday and building my tool that it did not happen this week. If you missed it, here is last week’s episode to help you assess December traffic drops. Even if you didn’t have a traffic drop, there’s a lot to be learned in this episode.

I also had fun doing two livestreams this week where I opened up the QRG and reviewed the new changes. There are a lot!

In this one I shared a bit about how I use ChatGPT as an assistant, while dissecting the new QRG changes.

The latest important info on AI and ChatGPT

Last week I shared some gloomy thoughts on how I believe that AI based chat will radically change our world…and likely very quickly. 

I am not an expert on AI chat, but I am learning. I think it is important for us, as advisors and the people business owners look to for advice, to understand as much as we can about how AI is changing the digital landscape.

The coming wave of disinformation

I have no doubt that AI based chat is going to completely disrupt Search as we know it. If Google does not implement it alongside search, Bing will. I have seen several examples of new search engines that are using it. Last week I shared a Chrome extension that you can use to add OpenAI chat alongside your search results.

I want to tell you how making scalloped potatoes made me both excited and terrified about AI.

On Christmas day, David asked if I could make scalloped potatoes. Sure. I have made them many times before. I don’t have a favourite recipe. What I usually do is look up a base recipe and then tweak it to make it gooooood.

So, I asked ChatGPT for a recipe. 

If I messed up the recipe, the worst that happens here is we have yucky potatoes. (They were amazing by the way.) 

But what if I’m doing a search for a medical treatment, a research paper, or a new skill I’m learning and it’s slightly off?

The concern I have over AI chat is that when it’s wrong, it’s convincingly wrong. Remember when John Mueller and Barry Schwartz argued over cheese being a ranking factor?

There is often no way to know that the answer you are getting is not completely accurate.

OpenAI discusses this in their 2021 article about WebGPT, which shows that references can be inserted into the answers given.

I believe that Google is our best hope when it comes to fighting the waves of disinformation that are likely to be created with widespread use of AI chat tools.

How Google can help fight the coming wave of disinformation

Here’s a thought I had this week. What if Google has seen the AI disinformation age coming for several years now? This document on how Google fights disinformation makes a lot more sense now. At the time of publishing, I wondered what specific disinformation Google was referring to. I had wondered if this was in regards to keeping the internet safe from bad actors trying to publish snake oil or miracle cures. That may be part of it. But now I think they saw AI chat coming.

Here are a few excerpts I found interesting.

From Google's guide to fighting disinformation:

Google tells us in this document what one of their main weapons against disinformation is. This was the first mention outside of the QRG of the concept of E-A-T (Expertise, Authoritativeness and Trustworthiness) being approximated by Google through measurable signals. E-A-T, now E-E-A-T, is Google’s weapon against disinformation!

It’s no coincidence that Google’s announcement about the extra E in E-E-A-T, Experience, happened at the same time the world came to hear of the power of AI chat.

Google's documentation on E-E-A-T

We will be discussing Experience a LOT in the weeks to come. As I study sites hit by recent updates, I’ll be looking in great detail as to whether Google seems to be rewarding more sites demonstrating experience. Or perhaps this is something we will see with the next core update?

Funny story:

Michelle Bourbonniere shared a great tweet with me that Bill Slawski had been discussing a while back. If you’re interested in reading more about how Google can fight disinformation, check out this paper, which talks about how AI language models “are prone to hallucinating, and crucially they are incapable of justifying their utterances by referring to supporting documents in the corpus they were trained over.” The answer, the paper says, is in recognizing who the domain experts are.

Can AI detection tools detect ChatGPT?

This was an interesting thread:

Gael used a tool called originality.ai to check whether AI-written content was detectable. This is the post I discussed in the Q&A section above. I know it is helpful content because I had already written this up for the newsletter before adding the Q&A.

In Gael’s test, content that was written by Jasper and human edited was detected as 18% original and 82% AI.

Truly human written content was detected as 98% original.

He pasted a Brian Dean article into ChatGPT and asked it to write “an article about another SEO tactic in the same style”. This, despite being written by AI, was detected as 91% original.

But then he asked ChatGPT to simply, “write an article about 5 affiliate marketing tips” and it was detected as 100% AI written.

I think it’s really interesting to see these results. But, it is important to know that Google is not against AI generated content. They’re against low quality, unoriginal, uninspiring content. The thing is, most AI generated content fits that bill.

You can use AI to help you write great content. But to succeed you likely will need to add originality that doesn’t already exist on the web. One of the best ways to do this is by demonstrating your experience!

Other interesting AI news and information

Other interesting news and stories

This was an interesting read about passage ranking:

Google first announced passage ranking technology in October of 2020.

Their blog post describing passages says, “We’ve recently made a breakthrough in ranking and are now able to better understand the relevancy of specific passages. By understanding passages in addition to the relevancy of the overall page, we can find that needle-in-a-haystack information you’re looking for.”

I think it’s interesting that this same blog post tells us about advances Google has made in understanding subtopics: “As an example, if you search for “home exercise equipment,” we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page. We’ll start rolling this out by the end of this year.”

Google is getting better at finding good helpful content on pages, even if the pages aren’t properly SEO’d or well structured with headings.

Here’s a good recap on this year’s updates and Google news:

More discussion on whether links have less impact for helping sites rank than in the past.

Helpful for local businesses

Sterling Sky did a test where they got a bunch of people to click on reviews to see if they could possibly move those reviews up in the order they are displayed.

And…”the review did not move!”

Cancel your plans to create click farms!

If you run a WordPress site and use the stop spammers security WordPress plugin, be aware:

Are you noticing that your local packs are appearing lower down in the SERPS now?

A good tip from Colan!

SEO Jobs

Looking for a new SEO job? SEOjobs.com is a job board curated by real SEOs for SEOs. Take a look at five of the hottest SEO job listings this week (below) and sign up for the weekly job listing email, only available at SEOjobs.com.

I’m really looking forward to seeing how the link spam and helpful content updates pan out. As I assess sites hit, I’ll be developing my tool alongside. I am SO excited!

I hope you are having a wonderful holiday! (if it is one for you.)

See you next year!

Marie
