Search News You Can Use
Episode 6 - April 6, 2017
This is another packed episode. We'll talk about some possible algorithm updates over the last two weeks as well as the latest on the "Fred" Algorithm update. I'll share with you two things that you may be doing that could cause you problems with Google's mobile interstitial algorithm as well as some cool keyword research tips. There are also really good things in here for people with small businesses.
In this episode:
- Were there significant algorithm updates on March 30 and April 4?
- The latest on "Fred"
- Store location popups could be seen as an interstitial
- Chat Bots can be seen as interstitials and could cause harm
- Google ignores priority in sitemaps
- Are you linking to all of your pages?
- Google is now showing Knowledge graph info in the main search results for local businesses
- Some information on the keyword stuffing algorithm
- How to get new sites/pages indexed INSTANTLY
- Tip: This is a great way to find content to write about
- Recommended Reading
Were there significant algorithm updates on March 30 and April 4?
Who is this for?
Anyone tracking rankings who noticed volatility in late March or early April.
Barry Schwartz reported that all of the rank trackers like MozCast, Algoroo, etc. noticed a huge amount of turbulence around March 30, 2017. However, there really hasn't been much talk on Twitter or in search forums about sites seeing major changes.
We think that the rank trackers were all noticing turbulence because Google increased the number of Knowledge Panels and Related Questions.
Seeing big movements in Knowledge Panels and Related Questions over the past 2-3 days -- https://t.co/zezk9DD2bT
— Dr. Pete Meyers (@dr_pete) March 31, 2017
Also, it looks like something significant happened on April 4th as well. Glenn Gabe tweeted today that some sites affected by Fred on March 7 appear to have had a mild reversal on April 4th:
OK, 2 sites impacted by the 3/7 update (Fred) saw opposite movement on 4/4. GA Data. Could be an adjustment based on 3/7. Stay tuned. #seo
— Glenn Gabe (@glenngabe) April 7, 2017
I didn't notice anything obvious with the sites that I monitor.
The latest on "Fred"
I wrote a lengthy writeup last episode on what we knew about Fred so far. On March 7, many low quality sites saw massive drops in rankings. There was a lot of discussion in the black hat forums about sites being hit.
This week, Gary Illyes commented that while SEOs were talking about one big algorithm change on March 7, there were actually several updates happening at the same time.
@kimpittoors From my perspective every update is a Fred, but there were a number of updates around the date @rustybrick declared Fred a single thing.
— Gary Illyes ᕕ( ᐛ )ᕗ (@methode) April 1, 2017
@kimpittoors @rustybrick Oh they do.
— Gary Illyes ᕕ( ᐛ )ᕗ (@methode) April 1, 2017
This is not surprising as Google has said recently that they often release two or three changes to the algorithm in one day.
It has been two weeks since I wrote my thoughts on Fred, and I have seen a number of sites that were of decent quality but still saw a mild drop that started in early March.
Some blackhats are theorizing that Google made tweaks to Penguin at the same time:
"Fred," or whatever the 1/2 updates they rolled out (one being Penguin refresh) is the most random algorithm ever rolled out by Google.
— SEOwner (@tehseowner) March 21, 2017
I didn't personally see any obvious changes in sites that had been previously affected by Penguin. However, many of the blackhats are saying that they feel that there was an update in early March in Google's ability to devalue many Private Blog Network links.
Now that we are past the days of definitive Panda and Penguin updates, it probably doesn't matter to most of us what Fred is all about, as the answer for recovery is almost always to find ways to improve the quality of your site. Still, if you were hit in early March, you are probably looking for as much information as you can get on what types of things need to be improved upon.
If you are trying to rank a low quality site based on PBNs, then I don't have much advice for you. But, if you run a legitimate business that saw a drop in early March, then I really would focus on doing all that you can to make your site the best possible option for Google to show searchers.
Store location popups could be seen as an interstitial
Who is this for?
Businesses whose mobile site immediately shows a popup asking visitors which store location they want to access.
This was an interesting tweet. Someone asked whether a store locator popup could be seen as an interstitial by Google. An interstitial is a popup or overlay that blocks the content a visitor came to see, and intrusive interstitials can cause your site to be algorithmically demoted in mobile results. John Mueller said that a popup like this could trigger the interstitial algorithm.
— Charles-Olivier Roy (@charleoroy) March 28, 2017
@charleoroy @methode Seems like something we'd see as an interstitial (or at least index instead of the other content behind it.)
— John ☆.o(≧▽≦)o.☆ (@JohnMu) March 28, 2017
If you want to use a popup like this to help customers choose the right store location, don't display it as soon as users enter your site. You can use popups like this without triggering the interstitial algorithm by either waiting until the user has been on the site for a while, or by displaying a popup that doesn't cover the majority of the page.
Chat Bots can be seen as interstitials and could cause harm
Who is this for?
Websites that use a chat bot that takes up the entire view on mobile.
A chat bot can be a good way to engage users and get more leads. In the reviews that I do, I see a lot of sites that have a little chat box in the corner of their website that encourages visitors to the website to ask a question. If you use one of these, however, be sure that it is not annoying to users.
John Mueller was recently asked about chat boxes and said that if they cover the entire page on mobile, they could be considered "obnoxious" to users.
On mobile, if a chatbox pops up as soon as the user enters the site, and covers the whole page, then this could possibly trigger the mobile interstitial penalty.
On desktop, there is currently no known penalty for a chat box that immediately obscures the page on which users are landing. However, users do not like having to immediately close a popup. I believe an obtrusive chat box popup can lower user engagement and could cause one of Google's quality algorithms to treat your site as low quality.
I think that it is ok to have a chat box provided that it does not hinder a visitor from reading the content that they came to your site to see.
Google ignores priority in sitemaps
Who is this for?
Anyone who sets the priority attribute in their XML sitemaps.
In an XML sitemap, you have the option of adding a priority for each URL. The priority is a number between 0 and 1, with a higher number indicating a more important page on your site. The default priority is 0.5. A page that is really important could be given a priority of 1.
It used to be believed widely that giving a page a higher priority could increase the chances of it being indexed and possibly help it rank better. However, Google's Gary Illyes recently said that Google ignores sitemap priority completely.
@denstopa we ignore those. It's essentially a bag of noise
— Gary Illyes ᕕ( ᐛ )ᕗ (@methode) March 28, 2017
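For reference, here is what that (now-ignored) priority element looks like in a sitemap. This is a minimal sketch that builds one with Python's standard library; the URLs are just examples:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap. The <priority> element (0.0-1.0)
    is the field Gary Illyes says Google now ignores."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, priority in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = str(priority)
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://example.com/", 1.0),
    ("https://example.com/about", 0.5),
])
print(sitemap)
```

There is no harm in leaving priority values in an existing sitemap; they are simply ignored.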
Are you linking to all of your pages?
John Mueller said this week that if a page is not linked to, Google will not find it.
@collindaviss If there are no links, we won't find the URL, robotted or not.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) March 28, 2017
He also said that having a URL in a sitemap will not guarantee that the page is indexed.
This is probably common sense, but if you have a page that you are trying to get in the index, you need to make sure it is linked to from somewhere.
On the same note, if a page is important, make sure that a good number of internal links point to it, including links from within the main content of other pages. A link from your sidebar or navigation will likely help the page get discovered, but links in those places do not count as strongly as links from within the main content of a page.
I have seen several situations where we increased the ranking of a page by adding internal links to it from several pages on the site.
After all of this is said however, later in this episode I'll tell you about a neat situation where I was able to get some content indexed and ranked very quickly without links.
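If you already have a crawl of your site, finding pages that nothing links to (and that Google therefore may never discover) is straightforward. Here is a minimal sketch, assuming you can represent your crawl as a mapping from each page to the internal pages it links to; the URLs are made up:

```python
def find_orphan_pages(link_graph, start="/"):
    """Return pages in the graph that no other page links to.
    link_graph maps each page URL to the list of internal URLs it links to."""
    linked_to = {target for links in link_graph.values() for target in links}
    return sorted(page for page in link_graph
                  if page not in linked_to and page != start)

site = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/products/widget": [],
    "/blog": [],
    "/old-landing-page": [],  # nothing links here, so Google won't find it
}
print(find_orphan_pages(site))  # ['/old-landing-page']
```

In practice you would feed this from a crawler's export rather than a hand-written dictionary, but the orphan-detection logic is the same.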
Google is now showing Knowledge graph info in the main search results for local businesses
Who is this important for?
Any business with a local presence.
Google is now showing information such as opening hours, address and reviews at the top of the organic results for local searches. For example, here is a search for the opening hours of a local cafe:
If you are a business with a local presence, it is vitally important that you have the correct info in your Google My Business profile. Google does its best to determine your hours, but doesn't always get it right. To verify your business information, log in to your Google My Business profile.
Also, we talked a few episodes ago about ways to get reviews for your business. Google loves user generated content and this includes reviews. Getting more reviews for your business is always a good thing!
Some information on the keyword stuffing algorithm
Who is this for?
Anyone worried their pages might be over-optimized for a keyword.
Google has an algorithm that is designed to recognize keyword stuffing on a website. We don't know exactly how it works. I once did an experiment where I took a page that ranked near the bottom of page 1 and then I stuffed the heck out of my main keyword. In other words, I added that keyword about a hundred times in the text. It looked awful. I hypothesized that it would drop in rankings. Instead, it went up three places.
So, obviously Google doesn't look at things simply and say, "Ah, there is xx% keyword density on this page so we will penalize it." It is likely quite a bit more complicated than that.
John Mueller said something interesting in a hangout recently:
"Focusing on keyword density is probably not a good use of your time. Focusing too much on keyword density makes it look like your content is unnatural and makes it hard for users to read and search engines generally recognize that fairly quickly and say, 'Oh...this guy is just trying to keyword stuff their pages and therefore we will just ignore this keyword completely on this website.'"
I find that quite interesting. If you have pages where you just can't seem to get any rankings for a particular keyword, it may be worthwhile to try drastically reducing the number of times it is mentioned and see if that helps.
The keyword stuffing algorithm is one that reruns every time Google crawls your pages. As such, if you are suppressed and you remove a bunch of keywords, then the next time Google crawls your page you should see an improvement. It's quite an easy thing to test.
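Keyword density itself isn't a target to optimize, but if you suspect a page is suppressed, a rough count can tell you how often a phrase actually appears before and after your edits. A minimal sketch (the sample text is made up):

```python
import re

def keyword_density(text, keyword):
    """Return (count, density %) for a keyword phrase in a block of text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    # Count every position where the phrase appears as consecutive words.
    count = sum(words[i:i + len(kw)] == kw
                for i in range(len(words) - len(kw) + 1))
    density = 100.0 * count * len(kw) / len(words) if words else 0.0
    return count, round(density, 1)

text = "Blue widgets are great. Buy blue widgets. Blue widgets, blue widgets!"
print(keyword_density(text, "blue widgets"))  # (4, 72.7)
```

A number like that second one is an obvious red flag; the point of the experiment above is that there is no single "safe" percentage, so use counts like this only to compare versions of your own page.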
How to get new sites/pages indexed INSTANTLY
Who is this for
Anyone who publishes new content
I am often guilty of thinking up awesome side project ideas and buying up domain names that I never end up using. However, last weekend I decided to act on one of these impulses and created a little lead generation site for something completely unrelated to SEO. I purchased the domain name at Namecheap and set up a new profile with my host. I then set the DNS.
For those of you who are new to this, "setting the DNS" means that I told Namecheap where to send visitors for that domain (to my host's servers). They tell you that it can take up to 48 hours for the DNS to propagate. In other words, it may take two days before my domain name actually shows my website. Amazingly, within 30 seconds of setting the DNS records, my domain name was working. That is incredible.
But there is more.
I spent a few hours creating the site. I know you want to see it...but it's a really good idea and I know that someone will steal it and do a better job than I did, so I'm not sharing. 😛
Then, I registered the site with Google Search Console. I used Google Search Console's fetch and render tool (under Crawl --> Fetch as Google) to fetch the page and make sure that Google could render it properly. And then I hit "submit to index".
If any of you have ever created a website, I know that you did what I did next: immediately Googled your site to see if it was indexed. In the past it would take a day or two for Google to recognize a new site and then it would take several weeks for the site to start ranking for anything meaningful.
I was shocked to see that Google indexed my new site IMMEDIATELY after I submitted it to the index. Also, the site was ranking on page 2 for a medium competitive phrase:
WHAT? That is crazy. These are not personalized results and it is a term that has competition.
So, what is happening here?
John Mueller spoke about this in a hangout recently, and it is something called "fast track indexing". Now, when you submit a page to the index it can get ranked extremely fast. However, the page is not likely to stay at those rankings for long.
Over the next few days I watched as my new site dropped to page 9, and then crawled up to page 6 and then appeared to disappear for a while. It's now sitting on page 3 which is probably where I would expect a brand new site for this query with no links to rank.
I'm not making any money from this site yet. 🙂 Maybe one day I'll get the time to work on getting links to it.
Tip: This is a great way to find content to write about
Dan Shure tweeted this great tip a while back:
.@BritneyMuller Want to see something cool? Find Quora questions ranking page #1 with 100+ searches with @semrush - answer those first 🙂 pic.twitter.com/y0wzgieems
— Dan Shure (@dan_shure) March 13, 2017
What he is doing is the following:
- Put Quora.com into SEMRush.
- Use the advanced search features to search for one of your keywords for which Quora ranks in the top 6 and for which there is search volume of at least 100.
- Now you'll have a list of keywords for which Quora is ranking well. That means that a lot of people have this question and there is probably opportunity for you to write an in depth article on this subject.
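Once you export that keyword list from SEMrush, a quick filter narrows it to Dan's criteria. This is a sketch; the column names ("Keyword", "Position", "Volume") and sample data are assumptions, so adjust them to match your actual export:

```python
import csv
import io

def filter_opportunities(csv_text, max_position=6, min_volume=100):
    """Keep keywords where Quora ranks in the top N with enough search volume.
    Column names are assumed; adjust to your SEMrush export."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r["Keyword"] for r in rows
            if int(r["Position"]) <= max_position
            and int(r["Volume"]) >= min_volume]

export = """Keyword,Position,Volume
how to start a blog,3,1200
obscure question,2,40
best seo tools,8,900
is seo worth it,5,350
"""
print(filter_opportunities(export))  # ['how to start a blog', 'is seo worth it']
```

Each keyword that survives the filter is a question real people are searching for, and one where a well-written article has a realistic shot at outranking a Quora thread.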
Do we still need to disavow in the era of Penguin 4.0?
This is a post I wrote for the Moz blog this week. Now that Penguin no longer demotes sites, we likely don't need to be doing as much disavowing. You should still be disavowing if you have low quality links that are obviously there for SEO reasons. The main reason for this is to avoid a manual action. The article also discusses whether or not we should be trying to reavow links - and how to do it.
Why Is This [BLEEP]er Outranking Me in the Google Maps Results?
This article goes over possible reasons why a competitor is outranking you. It's a really good read. It's meant for local SEOs but really, anyone can benefit from reading this article.
SEO in 2017: Proven Content Ideas That Attract Backlinks
By Glenn Allsopp from Gaps.com on the Smart Passive Income site. This is a great post on ideas for creating great stuff that people will actually want to link to.
SEO Agencies Receiving Emails Threatening Negative SEO, DDoS Attacks
By Bill Hartzer. This is a scary read about an email that some SEOs are getting whereby an anonymous group is trying to blackmail SEOs into paying them a fee in order to avoid a negative SEO attack. If this were to happen to me, I wouldn't worry too much about someone pointing links to my site as I could just disavow them (and really, Google's algorithms should just ignore them.) However, a DDoS attack could mean a real loss in business.
How to Get Your Emails Delivered to the Gmail Primary Tab Easily
If you do email marketing at all, this is a great read. If your subscribers are using Gmail, you really don't want your email to end up in their promotional tab. The author of this article did some experiments and came to the conclusion that you have a much better chance of avoiding the promotional tab if you avoid sending a heavy HTML email and also if you take it easy on including images, links and prices.
How To Fetch & Render (Almost) Any Site
This is a really cool post on the Screaming Frog site. They discovered that you can use Google's fetch and render as Googlebot to see how Googlebot renders any site simply by creating a page on your site that calls the page you want to render in an iframe. That's a pretty cool hack.
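The trick boils down to serving a wrapper page like this on a site you have verified in Search Console, then running Fetch and Render on the wrapper. Here is a sketch that generates the wrapper HTML with Python; the target URL is just an example:

```python
def render_wrapper(target_url):
    """Return an HTML page that loads target_url in a full-viewport iframe,
    so Fetch as Google on this page renders the framed site."""
    return f"""<!DOCTYPE html>
<html>
<head><title>Render test</title></head>
<body style="margin:0">
  <iframe src="{target_url}" style="border:0;width:100vw;height:100vh"></iframe>
</body>
</html>"""

html = render_wrapper("https://example.com/")
print(html)
```

One caveat: sites that send framing-protection headers such as X-Frame-Options will refuse to load in the iframe, so this won't work on every site.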
That's it for this week! I'd love for you to leave a comment if you have any thoughts on any of these topics.
Read Previous Episodes of Marie's Newsletter.