Search News You Can Use

Episode 3 - February 24, 2017

This is a packed issue of Search News You Can Use. As I write this, I am at SFIMA Pubcon in Fort Lauderdale. I've added a few tips to the newsletter that were part of the great content here at the conference. Hope you enjoy!

In this issue

News

Tips

 

News


 

Possible algorithm update

Who is this important for?
Everyone
 
Summary
Google is continually updating their algorithms, but every now and then there is a bigger update that seems to affect a lot of people. It looks like there was a big update of some kind on February 7th. There is no official name for this update. Like most algorithm updates it seems to be related to overall site quality.
 
Glenn Gabe wrote a good post on his findings and made an interesting observation: he felt that this update was similar to Panda and focused on site quality. Barry Schwartz has also written that this could be a Phantom/Quality update. When I looked at my own data, I noticed something interesting: a couple of my past clients who saw increases were clients that had previously benefitted from Penguin.
[image: Feb 7, 2017 Google update]
This doesn’t necessarily mean that the February 7th update was all about link quality though. Most sites that I worked on to help remove Penguin issues were also actively working on site quality.
 
If your Google organic traffic dropped on February 7th I’d recommend that you have a thorough read of Glenn’s post. If things don’t improve, it might be a good idea to have a site quality audit done.
 
Note: Be sure you are looking at just Google organic traffic. Here is a tutorial that I wrote that shows you how to look at just Google organic traffic. If you see a drop that happens on all search engines then it’s not likely that you were affected by this Google algorithm update.


 

Gary Illyes Tweets about robots.txt and noindex tags

Who is this important for?
Anyone who has pages that they want removed from Google’s index
Summary
Gary Illyes from Google tweeted a bunch of “Did you know?” tweets over the last couple of weeks. Barry Schwartz has listed them all here. I’m going to talk about a few of these tweets which we can use.
 


 
 
Let’s say you find that you have a bunch of pages in Google’s index that shouldn’t be there. Perhaps you have tens of thousands of pages from your /search/ directory that serve no purpose being in Google’s index. I’ve commonly seen people make the error of blocking the /search/ directory in their robots.txt file while also adding a noindex tag to those pages. If you do this, the robots.txt block will keep Google from crawling the pages, so they will never see the noindex tag.
 
Recommendations
If you want to remove pages from Google’s index, here is what I recommend:
 

  1. Add a meta name=”robots” content=”noindex, follow” tag to the page.
  2. In Google Search Console, use the URL removal tool to ask Google to remove this directory.
  3. Each day, do a search to see if these pages have been removed from the index. You can do this by searching site:mysite.com/search/
  4. Once the pages are completely gone, add a line to your robots.txt file to keep search engines from crawling those pages. (User-agent: * Disallow: /search/)

 
Why bother blocking the pages by robots.txt if they have a noindex tag on them? If you don’t, then even with the noindex tag, Google will still spend time crawling these pages. This can eat up your crawl budget and dilute the overall quality of your site.
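If you'd like to sanity check this on your own site, here is a rough sketch of the kind of script you might use. This is my own illustration rather than anything from Google, and the /search/ URL below is a hypothetical example: it simply verifies that a page you want deindexed still carries a noindex tag and that robots.txt is not already hiding that tag from Google.

```python
# A rough sketch (my own illustration, not from Google) that checks whether a
# page you want removed from the index still carries a noindex tag AND is
# still crawlable, i.e. that robots.txt isn't hiding the noindex tag from
# Google. The URLs below are hypothetical examples.
import re
import urllib.request
import urllib.robotparser

page_url = "https://www.mysite.com/search/widgets"
robots_url = "https://www.mysite.com/robots.txt"

# 1. Is Googlebot still allowed to crawl the page? It needs to be until the
#    page has actually been dropped from the index.
rp = urllib.robotparser.RobotFileParser()
rp.set_url(robots_url)
rp.read()
crawlable = rp.can_fetch("Googlebot", page_url)

# 2. Does the page serve a noindex directive? (A simple regex check; a real
#    audit tool would parse the HTML properly and also look at X-Robots-Tag.)
html = urllib.request.urlopen(page_url).read().decode("utf-8", errors="ignore")
has_noindex = bool(re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I))

if has_noindex and not crawlable:
    print("Problem: robots.txt blocks this page, so Google will never see the noindex tag.")
elif has_noindex and crawlable:
    print("Good: the page is crawlable and carries noindex, so Google can drop it.")
else:
    print("No noindex tag found on this page.")
```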


Gary Illyes tweets about duplicate content causing crawl budget problems

Who is this important for?
Large sites, especially large eCommerce sites
 
Summary
Gary Illyes tweeted the following:
[embedded tweet]
 


There is no duplicate content penalty, but if Googlebot spends all of its time crawling pages that are near duplicates of each other, that can “dilute the signals” that you are sending to Google and they may possibly view your site as lower quality than it is.
 
Tip
If you have products on your site with URLs that contain many parameters, such as size, color, etc., try searching Google to make sure that you don’t have a duplicate content problem. Here is one way. You can search the following:
 
site:mysite.com inurl:size inurl:color   (assuming that you have URLs that have size and color in them)
 
Let’s say that pages are coming up that look like this:
 
mysite.com/clothing/blue-shirt/?size=l&color=blue
 
Now do this search:
 
site:mysite.com/clothing/blue-shirt/
 
You may find that you have many variations of URLs that are all essentially the same page. If this is the case, there is information here on proper use of canonical tags to resolve this issue. If you find this too complicated, you can contact me and I can put you in touch with a good SEO to help you fix it.
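If you want a quick way to see what your parameterized URLs are already doing, here is a small sketch (my own illustration, with hypothetical URLs) that fetches a couple of the size/color variations and prints the canonical URL each one declares. Ideally every variation points back to the clean product URL.

```python
# A minimal sketch (my own illustration, not from the article mentioned above)
# that fetches a few parameterized product URLs and reports which canonical
# URL each one declares. The URLs are hypothetical examples.
import re
import urllib.request

urls = [
    "https://www.mysite.com/clothing/blue-shirt/?size=l&color=blue",
    "https://www.mysite.com/clothing/blue-shirt/?size=m&color=red",
]

for url in urls:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    # Simple regex check; assumes rel="canonical" appears before href in the tag.
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)
    print(url)
    print("  canonical:", match.group(1) if match else "none declared")

# If every size/color variation declares the same canonical
# (https://www.mysite.com/clothing/blue-shirt/), Google knows which version to index.
```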


Gary Illyes tweets about keywords and their position on the page

Who is this important for?
Everyone
 
Summary
Gary Illyes tweeted the following:
[embedded tweet]
 


 
Google is pretty good at figuring out which content on your page is the important content. We all know that it is important to include our keywords in our content. But it is super important to make sure that our keywords are visible in our main content. In other words, if your important keywords are in a sidebar or in the footer, they may not carry as much weight.
I personally think that it is important to have your main keywords above the fold in your content, although I haven’t done testing to prove this.
 
I think it’s quite interesting that Gary mentions that a keyword’s surroundings may be important. I do think that just randomly placing keywords on a page is much less effective than having surrounding content that is related to those keywords.
 
Let me give an example. Let’s say that I was trying to rank a site for “best house cleaner in Toronto” and I had the following two options:
 

Best house cleaner in Toronto

[image]

We are located in Toronto and are the best house cleaners in Toronto. You can call us today at 416-555-5555

 
Or, perhaps I can look at the related searches at the bottom of a Google search for “house cleaner” and incorporate some of these in my above the fold text:
[image: related searches for cleaners]
Here is content I created that incorporates some of those words:

Best house cleaner in Toronto.

Are you looking for cleaning lady services? We offer one time cleaning, commercial cleaning and apartment cleaning. Deep cleaning is our specialty. Our maids are second to none. If you are looking for house cleaning services near you, we would love to discuss our rates and options. Call us today at 416-555-5555.

[image]

 
Challenge
I challenge you to take a few of your pages and run them through Fetch and Render as Googlebot. To do so, go to Google Search Console → Crawl → Fetch as Google. When you look at the rendered page as seen by Googlebot, what content is visible above the fold? Does it contain your main keywords? Are those keywords surrounded by related words?
 
If not, make some changes. If you see improvements, report back to me as I’d love to hear what happens. I recently did this on a client’s local business site and moved the client up three positions from #9 to #6 for one of their terms. Now, this is just one case and it’s possible that a combination of factors was responsible, which is why I’d like to hear how things work out for you.


Google releases a video on how to hire an SEO

Who is this important for?
Everyone
 
Everyone who is either thinking of hiring an SEO or who does SEO themselves should watch this video. It is really good. If you don’t have time to watch it, I’ve summarized the important points below:

  • SEO success is only as good as the quality of your website.
  • A good SEO looks to improve the entire searcher experience.
  • A good SEO will make basic changes like title tag changes or more detailed things like language markup for multilingual sites.
  • A good SEO will help to make sure your site is as helpful as possible to people who are coming to your site.
  • In most cases it will take 4 months to a year from the time you start making changes to the time when rankings improve.
  • When an SEO makes a recommendation they should back it up with a documented statement from Google that corroborates that recommendation.
  • Things like adding keywords to the meta keyword tag and buying links are not recommended by Google.
  • When hiring an SEO, conduct an interview with them, check references and also ask for (and pay for) a technical site audit.
  • In the audit, the SEO should prioritize each issue and include the issue itself, the suggested improvement, an estimate of the overall investment, the estimated positive impact, and a plan for how to improve or move on should the changes fail to help.
  • Duplicate content issues should be cleaned up for long term health rather than initial growth. It’s often not a pressing problem.
  • A search audit looks at branded and unbranded terms. An SEO will make sure that for branded queries, your customers are able to find you and have a good experience. For unbranded queries, an SEO can help you understand the competitive landscape. Can you beat the competition?
  • Ways an SEO can help improve rankings include updating obsolete content, improving navigation and internal linking, and generating buzz to garner natural links.
  • Don’t hire an SEO if you are not prepared for making the recommended changes.

A note from Marie: If you are looking at hiring an SEO, I would love to help connect you with someone who does good work.


Creating a Contest to get links? Not a good idea.

Who is this important for?
Anyone who is thinking that they could build links by creating a contest.
Summary
John Mueller said in a hangout this week that you need to be really careful if you are thinking of trying to build links by having a contest, as these links could go against Google's Quality Guidelines. I wanted to explain what he means, because not all of these links would be unnatural links.
John is talking about a situation where you have a contest and one of the rules of entry is that you require people to link back to your site. If you are requiring a link for entry purposes, then this is a link scheme in Google's eyes. However, let's say I decided to run a contest on my site and people wrote about that contest and decided on their own to link. This would be a good link. The difference is that in the first example, the link was a requirement for entry. It is always a great thing if you can create something that generates buzz and gets people talking and linking to it.


Reminder: Virtual Offices are not allowed for Small Businesses trying to rank locally

Joy Hawkins pointed out this great thread discussing virtual offices for small businesses. An example of a virtual office would be a service business located on the outskirts of a major city that purchases a P.O. box in the city center so that it can rank well within that city. It is important to note that this is NOT within Google’s guidelines. Here is a quote from the thread:
 

“Per the Google My Business Guidelines, virtual offices are not allowed unless staffed during the business hours.”


Mobile first index: still no launch date

Aleyda Solis has produced a great presentation on what we need to know about the mobile first index. Regarding the mobile first index launch, Gary Illyes said that Google still doesn’t have a launch date.
 


Tips

What is your site architecture like on mobile?

I keep talking a lot about mobile first indexing. This week I attended the Pubcon SFIMA conference. Eric Enge from Stone Temple Consulting gave a great example of an area where sites might have a problem. With mobile first indexing, Google will be crawling your mobile site in order to gather data. In the example Eric gave, a site’s most recent articles were five clicks deep on a desktop crawl and 247 clicks deep on mobile.
 
This means that if Google were to crawl the mobile site to gather information, there is very little chance that the new articles would get found.
 
This point resonated with me because I realized that one of my own sites is missing some of the architecture on mobile that it has on desktop. I would urge you to look at the mobile version of your site and determine whether Google will still be able to crawl your site as well as it can on desktop. Did you remove links to try to make the site look better on mobile? It might also be worthwhile to crawl the site with Screaming Frog using a mobile user agent. In fact, I plan to test this soon on a few sites to see whether I notice significant differences between crawling the site as a desktop crawler and as a mobile crawler.
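Here is a rough sketch of the kind of quick check I have in mind, using the requests library (my own illustration; the URL and user-agent strings are hypothetical examples). It fetches the same page as a desktop browser and as a mobile browser and compares how many links each version exposes:

```python
# A rough sketch (my own illustration) that fetches a page with a desktop and
# a mobile user agent and compares how many links each version exposes.
# Assumes the requests library is installed; the URL and user-agent strings
# are hypothetical examples.
import re
import requests

url = "https://www.mysite.com/"
user_agents = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X)",
}

for name, ua in user_agents.items():
    html = requests.get(url, headers={"User-Agent": ua}).text
    links = re.findall(r'<a\s[^>]*href=', html, re.I)
    print(f"{name}: {len(links)} links found")

# A big drop in the mobile count is a hint that navigation Google needs for
# mobile first indexing may be missing from the mobile version of the site.
```

This only looks at a single page, of course; a full crawl with Screaming Frog's mobile user agent is what will show you the click-depth differences Eric described.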


Tip: Quick wins with GSC Search Analytics data

At the SFIMA Pubcon conference this week, Bill Hunt gave a great tip on a way to use data in Google Search Console Search Analytics. The idea is to find pages on your site that rank well but are getting a poor clickthrough rate and then figure out why.
 
To do this, go to Google Search Console → Search Traffic → Search Analytics.
 
While on Queries, click Impressions, CTR (Click Through Rate) and Position.
 
Now, sort the table by impressions and look for queries for which you rank in the top three but have a low CTR. Bill suggested that under 5% is a good threshold, but really, if you rank #1 and are getting less than a 20% CTR, it’s probably worth looking into why.
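If you prefer to work outside the interface, you can download the Queries table and filter it with a few lines of code. Here is a minimal sketch (my own illustration) that assumes you've exported the report to a CSV called search_analytics.csv with Query, Clicks, Impressions, CTR and Position columns; your column names may differ slightly.

```python
# A minimal sketch (my own illustration) for finding "quick win" queries in a
# Search Analytics export. Assumes pandas is installed and that the CSV has
# Query, Clicks, Impressions, CTR and Position columns (column names may
# differ in your export).
import pandas as pd

df = pd.read_csv("search_analytics.csv")

# CTR is exported as a string like "4.2%"; convert it to a plain number.
df["CTR"] = df["CTR"].str.rstrip("%").astype(float)

# Queries that already rank in the top three but are clicked less than 5% of the time.
quick_wins = df[(df["Position"] <= 3) & (df["CTR"] < 5)]

# Sort by impressions so the biggest opportunities are at the top.
print(quick_wins.sort_values("Impressions", ascending=False).head(20))
```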
 
For example, let’s say that I run a site selling shoes, that I rank #2 for a query like “should I buy leather shoes or suede?”, and that this query gets lots of impressions but almost no clicks. The problem might have something to do with the meta description that Google is showing. Perhaps the meta description implies that the page is only about selling leather shoes and doesn’t mention suede shoes. A simple change to the meta description to let the user know that the page covers both leather and suede shoes might cause an increase in CTR.
 
On a personal note, I found that in a lot of cases I was ranking #1 and had a low CTR because my site had a featured snippet and that featured snippet fully answered the question. In other words, there was no reason for people to click through to my site. In the future, I may do some experiments to change the wording of what Google pulls for featured snippets to see if I can make it so that the featured snippet implies there is a lot more information to be found in the article itself. I’ll keep you updated on this.


Tip: “People Also Ask” is awesome for keyword research

Google is starting to really push “People also ask” boxes. These are a great way for us to determine what content we could add to our site to make it the most thorough resource possible.
 
For example, here is what I see when I Google “are tomatoes bad for dogs”:
[image: People also ask]
If I had an article on this topic, then I would want to make sure that all of those questions are answered in my article, or if there is enough info, I might write a whole new article. Also, if you click on these results, you’ll get even more ideas:
[image: People also ask, expanded]
Whoa. Now I have new ideas for new articles: “What fruit is good for dogs” and “Can a dog eat celery”, because apparently these are questions that people ask. Clicking on any of those results will surface even more queries for me to either add to my article or use to create new content.
 
If you are at all stuck for content ideas, this is a great way to find out what questions people are searching for.


Tip: Make sure your WHOIS info is correct and up to date

This was a scary article that was tweeted about by John Doherty this week. The article describes a situation where a site was taken offline because of incorrect WHOIS data. I would urge you to check the WHOIS data for your site and make sure that it is correct. Also, make sure that you still have access to the email address in your WHOIS record; that is the address ICANN will use to contact you if they suspend your site for improper WHOIS data. If you’ve used an email address that you no longer check, you won’t get notified of the suspension.
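If you want a quick way to eyeball this, here is a small sketch (my own illustration, using a hypothetical domain) that shells out to the whois command available on most Linux and macOS systems and prints any lines mentioning an email address, so you can confirm the contact address is one you still check.

```python
# A rough sketch (my own illustration) that runs the whois command for a
# domain and prints any lines that mention an email address. The domain is a
# hypothetical example; whois output formats vary by registrar.
import subprocess

domain = "mysite.com"
result = subprocess.run(["whois", domain], capture_output=True, text=True)

for line in result.stdout.splitlines():
    if "email" in line.lower():
        print(line.strip())
```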
That's it for this week! I'd love to hear your feedback. Feel free to leave a comment below or contact me here.