Search News You Can Use
Episode 17 - Oct 3, 2017

Once again we have had a super busy couple of weeks with a lot of search news. Google has been going crazy with algorithm changes. In this episode I’ll give some examples of sites that are being affected by the latest changes. We’ll also talk about Panda and share some good link building tips.

Paid members also get the following:

  • Do redirects cause a loss in PageRank?
  • Can’t get schema? This could be why.
  • What happens if the thumbnail Google uses for your site in GSC is wrong?
  • Should we be spending time auditing links?
  • Are links from widgets OK in Google’s eyes?
  • A neat trick to help make your widget and infographic links appear more natural.
  • Does it matter if your images look weird on fetch and render?
  • If Panda negatively affects your site, how long will it take to recover?
  • This kind of link is probably not worth that much.
  • A new featured snippet study.
  • A neat tip for looking for index bloat issues.
  • A new (free) tool to help you find thin content.
  • How to get a knowledge panel.
  • Local SEO: How to report GMB spam.
  • E-commerce news: Should you be rewriting stock product descriptions?

Note: If you are seeing the lite version and you are a paid member, be sure to log in (in the sidebar on desktop or below the post on mobile) and read the full article here.

Are you subscribed to my newsletter?

You can get the lite version for free, or subscribe to the full version for $18/month.


Recent algorithm updates

There has been a lot of turbulence in the algorithms lately. As always, Google is continually making changes. Some would argue that we shouldn’t be chasing the algorithms, but I still think it is a good idea to take note of the more significant changes that we see.
The algorithm tracker tools have all shown crazy movement in the last month:

It looks like significant changes happened on the following dates:

September 6-8, 2017

(We talked about this in our last newsletter).
This site saw a slow decline after the Fred algorithm changes in early March. We worked with them to improve on-page content, internal linking and other factors. It looks like those changes have been taken into consideration starting on September 6:

The owners of this site have worked for years to improve its quality. They were nicely rewarded by Fred and saw another little bump up on September 6:

September 19, 2017

We saw a few of our clients make some gains with this quality update.
This site has been working on consolidating and trimming out thin content and improving the quality of the new content that they produce. They saw a slight dip during the June 22 quality update and are now seeing a very slight improvement:

This site was hit heavily by the February and March algorithm updates. We worked on helping the site demonstrate its E-A-T (expertise, authoritativeness and trustworthiness) and also improve the quality of its new content. We’re nowhere near back to previous levels, but it’s nice to see an increase:

This site was negatively affected by a quality update in May. By that point they had already begun working with us to implement site quality changes, including trimming out thin content and making their product pages better than their competitors’. They saw an increase starting in August and got another boost on September 19. It is important to note that these are not seasonal changes:

September 27, 2017

There seems to be a lot of chatter about ranking changes on this date, but I don’t have any examples to study yet.


Google announced a new option to replace First Click Free

First Click Free was a program whereby Google would allow news publishers to let Googlebot see content that was behind a paywall, provided that users clicking through from search could see that content as well. Google allowed companies to give users one free view and then require them to register in order to see the rest of the content.
If a publisher wants to gate all of their content, then that content is hidden from Googlebot and can’t be surfaced in search. If publishers tried to show Googlebot content that the general public was blocked from seeing, that would constitute cloaking.
Cloaking means that you’re showing Google something different than what the average searcher sees.
This Search Engine Land article explains that the Wall Street Journal ended its First Click Free participation earlier this year, which resulted in a 45% drop in search traffic. However, it also resulted in a fourfold increase in paid subscriptions.
Just this weekend, Google announced a new solution called Flexible Sampling. There are two ways in which this can be used:
1) Metering. If you use metering, Google suggests that you allow free users to see a certain number of pieces of content per month before forcing a paywall on them.
2) Lead-In. This allows publishers to show the beginning of an article and then require payment for the rest to be seen.
Personally, I think that #2 is the best option all around. I recently signed up for a news publication because I kept getting sucked in by the quality of their content and I really wanted to read their articles. If I had been allowed, say, five free pieces of content each month, I likely would never have paid.
In order to implement flexible sampling, you’ll need to use the correct schema:
https://developers.google.com/search/docs/data-types/paywalled-content
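To make that concrete, here is a minimal sketch of the lead-in markup based on the documentation linked above. The URL, headline and the “paywall” class name are placeholders; the key parts are the isAccessibleForFree property and the hasPart block, which tells Google which CSS class wraps the gated portion of the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "mainEntityOfPage": "https://example.com/articles/sample-article",
  "headline": "Example article headline",
  "datePublished": "2017-10-01T08:00:00-04:00",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywall"
  }
}
</script>

<!-- The free lead-in sits outside the paywalled element -->
<p>The opening paragraphs that everyone can read...</p>

<!-- Everything inside this element is declared as paywalled -->
<div class="paywall">
  <p>The rest of the article, visible only to subscribers...</p>
</div>
```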
However, as Barry Schwartz reported, some are finding that Google’s structured data testing tool is not able to recognize this new schema. John Mueller said that this should be fixed soon.


Affected by Malware? There shouldn’t be lasting effects on rankings

If your site has had malware issues in the past, this shouldn’t cause the site to have any sort of a negative stigma on it, assuming the problems have been cleaned up:


Not all spam reports are manually reviewed

Most of you know that it is possible to file a spam report if you see a competitor doing things that go against Google’s guidelines.
Have you ever filed a spam report and nothing happened? I hear about this all the time.
I thought that Google reviewed every single spam report manually, but apparently this is not true. Instead, John Mueller said the following:
“This is something where we try to figure out which spam reports have the biggest impact and those are the ones we focus on and those are the ones the web spam team would then manually review and process and perhaps take manual action on. And most of the other ones that we see tend to be things that we can collect over time and that we can use to improve our algorithms over time.”


Google does not use disavow file information to train their algorithms

This is not new, but it is worth mentioning, as a lot of SEOs believe that Google uses crowdsourced disavow information to train their algorithms:


If you disavow a competitor, that will not hurt them. It will simply stop any of their links that point to your site from counting.
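For anyone who hasn’t used the disavow tool, the file itself is just a plain text list that you upload in Google Search Console. Here is a minimal sketch (the domains are made up):

```text
# Lines starting with "#" are comments and are ignored.
# Disavow every link from an entire domain:
domain:spammy-link-network.example

# Or disavow links from individual pages:
https://low-quality-directory.example/listing/12345
```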


Google ignores symbols in URLs

John Mueller said in a help hangout that Google ignores symbols in URLs. So, if you have content that is accessible only via a “#” in your URL, Google may not see it. He said, “If the ‘#’ leads to unique content, then that would be something I recommend changing because probably we’d skip out on that unique content.”
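As a hypothetical illustration: because Google drops everything after the “#”, content that only loads behind a fragment effectively shares a URL with the base page, while the same content on a real path gets a crawlable URL of its own:

```text
https://example.com/products#reviews    <- to Googlebot this is just /products; the reviews may be skipped
https://example.com/products/reviews    <- a distinct, crawlable URL for the same content
```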


Google’s Index Coverage report can now be exported

This feature is still in beta and only a small number of sites have access to it. One thing that I noticed when trying this out on a client’s site was that we couldn’t export the results. Now we can:


Hopefully the Index Coverage report will be available to everyone soon.


95% of changes Google makes to their algorithms are not actionable by site owners

Gary Illyes said at Brighton SEO that 95% of the changes that Google makes are things that we can’t act upon. This is a shockingly high number! I take Gary to mean that these days, when the algorithm changes, it is usually because Google is getting better at recognizing quality.
So, let’s say that you have a medical site and Google has recognized that you don’t have the medical E-A-T (expertise, authoritativeness and trustworthiness) needed to be giving the medical advice that is on your site. That’s a hard thing to fix without going to medical school or perhaps hiring a physician to write your content.
In the past, when Google made changes, in most cases we could adapt to those changes. But, more and more now, when we do traffic drop assessments, one of our main conclusions is that the site is dropping because it is not the absolute best choice for users. Those can be difficult issues to fix!


Will Google eventually mark http sites as insecure in the search results?

Jennifer Slegg asked Gary Illyes this question at Brighton SEO and the answer was no.


The Google iOS app now suggests content from other sites to readers

A lot of publishers are not going to be happy with this. Google announced this change on their blog.
Glenn Gabe has been tweeting some examples of situations where Google is doing this:


One thing that Glenn noticed is that this does not seem to happen for AMP pages. Perhaps this is a good reason to make the leap to AMP?


New info from SEMRush to help you win more featured snippets

I really like this new feature in SEMRush. You can set up a project in SEMRush and then click on Position Tracking, then Featured Snippets. This will give you information that looks like this:
For the site above, we can see that SEMRush has found 79 SERPs that display a featured snippet and in which this site ranks well enough to potentially win that snippet.
We can then take that information and try to win more featured snippets. Here is a previous post which I wrote on winning featured snippets.
Note: SEMRush doesn’t pay me to say nice things about them. I do so because I really find this tool useful. If you are interested in signing up for SEMRush, I do have an affiliate link which you can use here.


Local SEO News

More Home Service Ads appearing

Chicago is now seeing HSAs:


Google is now pulling local news from local bloggers and other non-news sites

Google announced that the local tab of Google News will now include community-update stories. This feature is apparently only available in the US right now. I see one community update in my local tab, but it is from a news site rather than a local blogger:
So how do you get featured here? Google says that they “used machine learning techniques to find additional sources publishing local content—like hyperlocal bloggers and high school newspapers”. They then reference this post on getting into Google News.
As this feature becomes more prominent in Canada, I will be testing it with some of my clients.
If you have experience in getting included in this section, I’d love for you to leave a comment.


Google’s HSA now offering in-home estimates and more


Google local finder is showing website mentions

This is an interesting find. I only seem to be able to replicate this for searches for “used cars”:
It will be interesting to see if we see more of this type of behavior in the future. At this point I don’t have any advice on how we can take advantage of this, but I feel like understanding this feature could be beneficial for local businesses.


More call extensions in organic search

Barry Schwartz wrote about the fact that Google is now showing more call extensions in organic search:
We think that you can trigger this by adding the appropriate schema to your website. Here is more information on how to do that.
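As a sketch of what that schema might look like: the usual way to hand Google a phone number is LocalBusiness markup with a telephone property. The business details below are made up, and it is our assumption (not something Google has confirmed) that this markup is what triggers the call option:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com/",
  "telephone": "+1-613-555-0142",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Ottawa",
    "addressRegion": "ON",
    "postalCode": "K1A 0B1",
    "addressCountry": "CA"
  }
}
</script>
```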


Recommended Reading

Google Ranking Factors & Quality Rating Criteria in 2017
This is a monster of a post written by Shaun Anderson and inspired by Bill Slawski’s writings on Google patents. There is so much insight here.
7 on-site SEO problems that hold back e-commerce sites
This is a great article on Search Engine Land that every eCommerce site owner should read.
Chrome Will Soon Mute Autoplay Videos With Sound—Here's Why You Should Be Worried
Beginning in January 2018, Chrome will start muting autoplaying videos that have sound, not just those that are ads. This article brings an interesting perspective on the implications of this move.
The Need For Speed in the ECommerce World
This is an interesting case study showing that page speed improvements helped a site’s performance.
How long does it take to deindex low-quality or thin content published by accident? [case study]
By Glenn Gabe on Search Engine Land. Spoiler alert: in this case, it took about three months for this low-quality content to be removed from the index.
Duplicate Content SEO Advice From Google
Great information on what Google has said about duplicate content. The post mentions that “duplicate content” is not mentioned in the Quality Raters Guidelines, but “copied content” is.
Apple switches from Bing to Google for Siri web search results on iOS and Spotlight on Mac
I’m already hearing anecdotal evidence of sites being affected, both negatively and positively, by this change.
How and Why to Do a Mobile/Desktop Parity Audit
Mandatory reading on the Moz blog.
From Inbound Links To The Rater Guidelines To AdSense, 7 Important Facts About Google Algorithm Updates As Told By Googlers
Another great post by Glenn Gabe. Here are my favourite bits:

  • The AdSense team doesn’t talk to the search team. As such, your AdSense rep is not likely qualified to advise you on search-related issues.
  • If you’re impacted by a quality update, it is likely going to take a significant amount of change and a long time in order to see recovery.
  • You really, really should read the Quality Raters Guidelines.
  • When Google is determining a site’s quality, ALL indexed pages are taken into account.

Want More?


You can subscribe to Dr. Marie Haynes’ newsletter by clicking on the PayPal button below. You’ll get an action-packed email every two weeks.

You'll also have access to past episodes, including this one.


Part of the challenge of SEO is staying on top of industry news, trends, and techniques. There is so much information out there that it is easy to get bogged down in information overload, and trying to separate what’s truly important from all that noise can be really time-consuming and challenging.

Marie’s newsletter is a game changer because it manages to cut through the fluff and deliver high-quality information that is not only really important for those who do SEO, but is also presented in a format that is really easy to absorb.
If you are looking for trusted, highly actionable information related to search, I would strongly recommend Marie’s newsletter.
Paul Macnamara - Offers SEO Consulting at PaulMacnamara.com
