Search News You Can Use
Episode 28 - April 11, 2018

It looks like the crazy algorithmic turbulence that we saw in March of 2018 has slowed down a little. We did see some significant changes on March 23, but so far I haven’t noticed any big obvious updates in April. In this episode we talk about some great content creation tips, tips on internal linking and much more. We also share some of the best tips we learned from John Mueller’s Reddit AMA.

Algorithm updates

March 23, 2018 Algorithm Update

This appears to be a BIG quality update. There is some confusion about this update, though, because on the exact same day, SEMrush made changes to their keyword index that made many sites appear to be seeing drops on March 23.
When we first started analyzing this update, almost every site that we looked at showed the following on SEMrush:

However, many of the sites that showed drops on SEMrush had perfectly normal traffic patterns in Google Analytics.
In the following weeks though, we did have several companies contact us for help after experiencing true large traffic drops starting on March 23.
At this point, this looks like another broad quality update where Google got better again at recognizing true quality.
It does look like many sites that saw changes with the big update between March 7 and 9 also saw further changes on March 23:

March 28, 2018 algo update?

I’m not so sure about this one as I didn’t see any clients that saw significant changes on this day. However, Barry Schwartz noted that there was a lot of forum chatter about algo changes on March 28.


Big losses for Pinterest

Pinterest.com appears to have been strongly hit by algorithm updates in March:
Pot Pie Girl has a good article explaining the losses. It has always bothered me that Pinterest ranked so well and then forced me to log in before I could see the content. It looks like perhaps Google is recognizing this annoyance as well.
If you rely on Pinterest traffic for referrals, you may notice drops in March as a result.
To check whether your Pinterest referrals have dropped, go to Google Analytics → Acquisition → Referrals and click on Pinterest.
Here is one client that saw a big drop in Pinterest referrals starting late February 2018:

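If you prefer to pull these numbers programmatically, here is a minimal sketch using the Google Analytics Reporting API v4. The view ID and credentials file are placeholders, and this assumes you have already set up a service account with read access to your GA view:

```python
# Sketch only: the key file path and view ID are placeholders.
from googleapiclient.discovery import build
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

report = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "123456789",  # placeholder GA view ID
        "dateRanges": [{"startDate": "2018-01-01", "endDate": "2018-04-01"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:date"}],
        # Keep only sessions whose source contains "pinterest"
        "dimensionFilterClauses": [{"filters": [{
            "dimensionName": "ga:source",
            "operator": "PARTIAL",
            "expressions": ["pinterest"],
        }]}],
    }]
}).execute()

for row in report["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```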

Google is testing infinite scroll in mobile search results

I have been seeing this for a while now, but apparently I was seeing an early test. Now, on mobile, instead of seeing “page 2,” etc., you’ll see a “more results” button like this:


Goo.gl URL shortener is shutting down

Google announced that as of April 13, 2018, they are shutting down the goo.gl URL shortener. The good news is that links that have already been created with goo.gl will continue to work.
If you want to use a URL shortener in the future, Google recommends either using Firebase Dynamic Links or a service like bit.ly or ow.ly.


Google Partner program announces big changes

Google announced changes to the Google Partner Program to take effect this month. A company can become a Google Partner by doing the following:

  • Having someone in the company complete AdWords certification.
  • Spending $10,000 USD across 90 days in your managed AdWords accounts.
  • Meeting performance requirements for growth and client retention.

In the past, if you were a Google Partner, Google may have sent you leads. However, this is now ending.
The announcement says that the following are changing:

  • No more leads will be sent.
  • You will no longer appear in Google Partner Search results.
  • You will no longer receive the insights that Google used to send.

Much of the Twitter discussion on this topic consists of people saying that they rarely received any leads from the program anyway, so not much is going to be lost.


A great way to generate content ideas for things people are actually searching for


This is talking about the “People also ask” boxes that appear in many search results:
Justin Briggs explained how you can scrape Google to get a full list of these questions and answers:


Take note, though, that it is against Google’s TOS to scrape their results. I wouldn’t recommend doing this on a big scale.
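For the curious, here is a very rough sketch of what such a scraper looks like. This is for illustration only: again, it is against Google’s TOS, plain HTTP requests are often blocked, and the CSS class used below is an assumption you would need to verify against the live HTML:

```python
# Illustration only -- Google's markup changes frequently and scraping
# violates their TOS.
import requests
from bs4 import BeautifulSoup

def related_questions(query):
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query},
        headers={"User-Agent": "Mozilla/5.0"},  # default UA gets blocked
        timeout=10,
    )
    soup = BeautifulSoup(resp.text, "html.parser")
    # "related-question-pair" is a placeholder class name for the boxes.
    return [el.get_text(strip=True)
            for el in soup.select("div.related-question-pair")]

print(related_questions("how to make a pot pie"))
```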


More on intelligent search features from Bing

We spoke in a previous newsletter about intelligent search features on Bing. Bing has now announced more changes.
Here is what you need to know.

  • These intelligent answers will now gather information from several sources. (No word yet on whether we, as site owners, can do anything to become one of these sources.)
  • When Bing recognizes that an answer contains an uncommon word, you can hover over that word for a definition.
  • Bing will now try to display multiple answers for “how to” questions.
  • Intelligent image search is expanding so that it includes not just shirts and handbags, but other fashion-related searches as well.
  • Bing is advancing their intelligent search capabilities by using Intel FPGA chips. It sounds like they are doing a lot of work to continue to improve intelligent search via machine learning.

Internal links can really help improve rankings

While some might consider this a little “braggy”, I really like tweets like this:


I have had similar results with a number of sites as well. I especially love improving internal linking for small business sites with medium to low competition. In some cases, I have found that good internal linking can boost main keyword rankings by a couple of positions.
I have spoken in past newsletters about internal linking. In general, I like to work on this on a page-by-page basis. I’ll do the following:

  • Do keyword research to determine which words we want each page to rank for.
  • Optimize the page itself for these keywords (i.e., the title tag, heading tags where appropriate, and keywords and synonyms in the body copy).
  • Do a search for site:mysite.com “keyword” to find which pages Google thinks are related to the keywords.
  • On each of those pages, find the first instance of a keyword that should link internally to the page that I am optimizing, and make that a followed link to that page. In some cases I need to change the wording a little bit. Also, it is important that the link comes from the main body of text and not just from breadcrumbs or a sidebar, nav or footer. (There is a rough sketch below of how the discovery step could be automated.)
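Here is the sketch mentioned above. It is a hypothetical helper rather than an exact replica of this process: given the pages that a site:mysite.com “keyword” search surfaced, it reports which ones mention the keyword but don’t yet link to the target page:

```python
# A fuller version would exclude nav, sidebar and footer text (per the
# advice above) and handle relative URLs.
import requests
from bs4 import BeautifulSoup

def linking_opportunities(candidate_urls, keyword, target_url):
    opportunities = []
    for url in candidate_urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        body = soup.body.get_text(" ", strip=True).lower() if soup.body else ""
        already_links = any(a["href"] == target_url
                            for a in soup.find_all("a", href=True))
        if keyword.lower() in body and not already_links:
            opportunities.append(url)
    return opportunities

pages = ["https://mysite.com/blog/post-1", "https://mysite.com/services"]
print(linking_opportunities(pages, "emergency plumber",
                            "https://mysite.com/emergency-plumbing"))
```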

How to quickly see the CSS for any part of a page


Can you get site search terms in the SERPs by adding schema?

A question was asked on Twitter about how to get or change sitelinks, like these:

This is not something that is controlled by schema, however. In the past, you could tell Google in GSC which sitelinks you didn’t want to appear. Now, these choices are made algorithmically.


If you do need to change your sitelinks, the best way to try and accomplish this is to improve your internal linking and make sure that your important pages are referenced a lot on your site.
You can also get jump links to certain parts of your pages by making good use of internal link anchors:

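If you want to see which anchor (id) targets a page already exposes, a quick sketch like this can list them (the URL is a placeholder):

```python
# List the fragment anchors (id attributes) a page exposes, so you know
# which "jump to" targets exist to link to.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/long-guide"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
for el in soup.find_all(id=True):
    print(f"#{el['id']} ({el.name}): {el.get_text(strip=True)[:60]}")
```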

John Mueller’s Reddit AMA

This was a fantastic resource. Here are the important points:
Q: Many responsive designs hide certain elements, like top level navigation, for usability purposes.
A: "Just purely for usability reasons I’d always make important UI elements visible. Technically, we do understand that you sometimes have to make some compromises on mobile layouts (the important part is that it’s loaded on the page, and not loaded on interaction with the page)"
Q: We see a difference in indexed pages between the old and the new search console. Is this a bug? Or does it have another reason?
A: "The index stats in the old & new search console are compiled slightly differently. The new one focuses on patterns that we can tell you about, the old one is basically just a total list. I’d use the new one, it’s not only cooler, but also more useful."
Q: One of my clients got a domain which had a bad history.
A: "There’s no “reset button” for a domain... so a manual review wouldn’t change anything there. If there’s a lot of bad history associated with that, you either have to live with it, clean it up as much as possible, or move to a different domain…"
Q: Many SEOs recommend utilizing the noindex tag at a page level vs. blocking with disallow in robots.txt, as supposedly robots.txt prevents the passing of link equity since Google cannot visit the page to assess link flow, whereas noindex allows Google to crawl the page.
A: “Robots.txt vs noindex or canonical: the problem with robots.txt for that is that we don’t know what the page actually shows, so we wouldn’t know to either remove it from search (noindex), or to fold it together with another one (canonical). All links to that robotted page just get stuck. For canonicalization, I’d strongly recommend not using robots.txt…"
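To make John’s point concrete: if a URL is disallowed in robots.txt, Google never fetches it, and therefore can’t see a noindex or canonical tag on it. You can check whether a URL is blocked with Python’s standard library (both URLs below are placeholders):

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# False means Googlebot can't fetch the page, so any noindex or canonical
# tag on it will never be seen -- links to it just get "stuck".
print(rp.can_fetch("Googlebot", "https://example.com/some-filtered-page"))
```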


Does code to text ratio matter for SEO?

A number of SEO auditing tools will point out pages that have low code to text ratio. John Mueller mentioned in a hangout that Google does not use code to text ratio in rankings.
However, we still pay some attention to this when doing our site reviews. Often, pages that have a low ratio of text to code are thin pages. It’s certainly not black and white, and I wouldn’t make decisions based on this number alone, but it can be a good place to look when trying to find thin content.
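If you want a quick way to triage pages by this metric, here is a minimal sketch. Remember, this is a diagnostic starting point, not a ranking factor:

```python
# Rough text-to-code ratio for triaging potentially thin pages.
import requests
from bs4 import BeautifulSoup

def text_to_code_ratio(url):
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    return len(text) / len(html) if html else 0.0

print(f"{text_to_code_ratio('https://example.com/'):.1%}")
```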


Moz has a new, improved link explorer

You can read about all of the exciting changes here. Here is a summary of what is new:

  • DA now updates every 24 hours. (It used to take a couple of months.)
  • The index is now 20 times larger.
  • There are new graphs for things like link growth over time, links gained and lost, and more.
  • Domain Authority and Page Authority have changed so that they correlate better with Google rankings.
  • They have added something called “link lists” that may be useful for things like broken link building, link removal efforts and tracking of link placements.

If you would like more information on the upcoming changes, Hive Digital has a great writeup.
Feedback from the industry so far is quite positive:


Important news for those who use the AdWords Keyword tool for keyword research


Tip for finding hacked content on your site

If you are seeing a sudden influx of ultra-spammy links to your site, this can be a sign that your site has been hacked. Often, hacks can be sophisticated enough that you, the site owner, can’t see them. This tip can help you see whether Google is seeing hacks that you are not:


Are your opt-in forms being blocked by ad blockers?

I found this an interesting tweet:


Apparently this can happen with other opt-in form providers as well. It was interesting to see that OptinMonster has to keep making changes to stay ahead of the curve here:


In reality, though, most opt-in popups really are ads, so it doesn’t surprise me that many ad blockers block them.


Have you looked at the “discovered, currently not indexed” section of GSC?


This section of Google Search Console is found in the beta GSC → Index Coverage → Open Report → Excluded:

Often this is a great place to look for thin and duplicate content. Every site is going to have some content that Google discovers but doesn’t index, but having a lot of content in this section is not a good sign.
I can’t prove this, but I think that having a large amount of “discovered, currently not indexed” content could be a factor that causes the Panda algorithm to demote a site.


PWAs are now on Safari


Chrome does indeed measure and use engagement metrics

I found this article on Moz quite interesting. The article was speaking about how Google uses data from opted-in Chrome users to help evaluate site speed. However, to me, this revelation supports the idea that Google can receive user experience information from Chrome as well.
The Chrome Privacy Notice says that if you are logged in to Google, information on your browsing can be sent to Google:

It also says that if you are signed in, you are likely sending Google information about clicks, the pages you visit, and how you use them:

What does this mean? It might mean nothing. It may mean that Google is just using information about site speed and nothing else. But, it makes sense to me that if Google is able to track preferences, clicks and other usage statistics, then those could be used to help determine which websites people use and engage with the most.


Code in your <head></head> section can cause problems

The head of your page is everything between the <head> and </head> HTML tags. John Mueller said that some scripts can cause HTML to be injected into the head of the page. When this happens, Googlebot may not see other elements in the head that follow the injected content.
For example, a script could cause content to be injected into the head, and if your hreflang tags follow that content, the hreflang won’t be seen.
If you’re not sure if this is happening for you, John recommends running your page through Google’s Rich Results test and then clicking on “View Source Code”:

What you want to see is whether any iframes or divs appear above important elements like hreflang tags, canonical tags or other key head components. If so, then it is possible that Google is not seeing those hreflang or canonical tags.
To me, this is an incredibly important statement. I would love to hear your comments on this. The next time we have issues with a site that is not performing as expected, or perhaps, where Google is not honouring canonicals, this is something we will check.
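Here is a rough way to automate a first pass at this check. Note that it fetches raw HTML only, so markup injected by scripts at runtime won’t show up; John’s Rich Results test suggestion remains the authoritative check:

```python
# Flag hreflang/canonical <link> tags that appear in the <head> after an
# injected <iframe> or <div>. Raw-HTML first pass only.
import requests
from bs4 import BeautifulSoup

def check_head(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    if soup.head is None:
        print("No <head> found")
        return
    injected_seen = False
    for el in soup.head.find_all(recursive=False):
        if el.name in ("iframe", "div"):
            injected_seen = True
        # rel is a multi-valued attribute, so BeautifulSoup returns a list
        important = el.name == "link" and el.get("rel") in (
            ["canonical"], ["alternate"])
        if important and injected_seen:
            print("WARNING: tag appears after injected markup:", el)

check_head("https://example.com/")
```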


Question for John Mueller on removing old posts from the index

In a recent help hangout, John Mueller was asked the following question:
“I have six to seven hundred pages that were created a long time ago for SEO with massive amounts of descriptive content... The content turns out to be really poor quality and these pages drive almost no traffic. I have heard that such low quality content could harm the entire website in terms of SEO. Is this true?”
Here is John’s answer:
“Yes, we do sometimes look at the website overall to figure out how it fits in with the rest of the web and where it would be relevant to show, and if we can tell that a website is primarily low quality content, spun content, rewritten content from other sources, then that might be something that we take into account with regards to how we show your website in search.”
When asked whether those pages should be 301 redirected to other parts of the site, John said,
“I would recommend either deleting these pages or just updating them. I think redirecting them to new pages that you create is kind of an unnecessary extra step. But, if you feel that you can add unique and significant compelling value to these pages, then I would just update them. Or, if you think that these pages have absolutely no sense to keep for the long run then just delete them.”
This is an issue that comes up for us a lot when we do site reviews. And often, the decision on whether to remove old unhelpful content is difficult. If old content has links pointing to it, then deleting those pages could hurt.
Years ago, I asked John Mueller about whether we need to remove old blog posts that are no longer read. He said that if the blog posts were helpful to people at the time, then there was no need to remove them now.
Updated note: It is interesting to note that we have reviewed several sites that saw drops with the March 9 update and that we feel were demoted because of something John said above. These sites had thousands of pages of content that was simply rewritten from another source. Remember that it is not enough to just have unique words on a page. Rather, your content also has to have unique value for users.
This is mentioned in Google’s Quality Raters’ Guidelines as well. A lot of the guide is dedicated to helping the Quality Raters determine whether content is copied from another source:
If you are a news site that republishes stories and rewrites them in your own words, then unless you are recognized as an incredibly authoritative site, you absolutely must add to this content in a way that makes people want to read your article rather than the original news source. This is quite difficult to do. Otherwise, your entire site can see drops in conjunction with Google quality updates.


Does adding structured data help with rankings?


A tip to help image pages rank better

John Mueller spoke about image-heavy pages in a recent hangout. He mentioned that Google can’t interpret what images are and that, in order for image-heavy pages to rank, they need content (i.e., text).
He recommended adding comments to pages like this:
“What I’d recommend there is to find a way to get more content onto the page. A simple way to do that could be to let people comment. User Generated Content brings some overhead in terms of maintenance & abuse, but could also be a reasonable way to get more text for your pages.”


Q&A Pages can sometimes be seen as low quality content

I found this an interesting comment by John Mueller in his Reddit AMA. A site owner said that he creates Q&A content and posts each question as a separate page. John said:
“I’d try to combine this into the article pages themselves. Making separate pages for each Q&A seems like you’d be generating a lot of lower-quality pages rather than improving the quality of your site overall.”
Often sites will create Q&A pages so that they can optimize each page for a specific question. It makes sense, though, that users would prefer to see one big page that contains every question they might have on a subject rather than many individual pages.


Google jobs displaying now in the UK


AMP News

Your AMP page can be your canonical page


However, if you don’t specify a canonical, and you have both AMP and regular mobile pages, Google will prefer the regular mobile page:


SEO Tools

This is a neat tool created by Russ Jones to help you determine whether you have links on the desktop version of your site that are missing from your mobile version. If so, your site could have issues with mobile-first indexing. (A rough DIY version is sketched below.)
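As mentioned above, here is a rough sketch of the same idea, not Russ’s actual tool. It fetches a URL with desktop and mobile user agents and diffs the links; sites that build their mobile pages with JavaScript would need a headless browser instead:

```python
# Report links that only the desktop version of a page contains.
import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X)"

def links(url, user_agent):
    html = requests.get(url, headers={"User-Agent": user_agent},
                        timeout=10).text
    return {a["href"]
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)}

url = "https://example.com/"  # placeholder
missing_on_mobile = links(url, DESKTOP_UA) - links(url, MOBILE_UA)
print(len(missing_on_mobile), "links missing from the mobile version:")
for href in sorted(missing_on_mobile):
    print(" ", href)
```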


Local SEO

Improvements to Q&A


New GMB descriptions are not used in rankings

We reported in our last episode that you can now write a description again in your GMB profile. However, don’t bother keyword stuffing it! It’s not used for rankings.


You won’t be notified for some GMB issues

Google mentioned in a help forum thread that they are now marking some issues in GMB as “secondary” and won’t send notifications for them. The example they gave is that if someone suggests a change to a non-sensitive attribute such as “serves breakfast,” the business owner won’t be notified.
This has me a little bit worried. I’m not sure that I would trust Google completely to notify me when important changes have been suggested for my business. If you’re doing local SEO I would recommend regularly checking all aspects of your GMB profile to look for unknown edits.


GMB has had some image issues lately

There have been several issues in the last couple of weeks in which images are not displaying properly for many businesses.


Google post offers?

This looks interesting. It looks like it is just a test so far:


Can you sue a business for a negative review?


New: “Services” section in GMB


Google announced this in the GMB help forum, saying the following:
“Back in January we launched a new Menu editor for the food service industry. This month, we are excited to announce that we have expanded our menu editor to now include additional services.
Businesses in health & beauty, and service businesses, such as plumbers and florists, now have the ability to add their menu of services directly to their listing through their Google My Business account.
Same as the food establishment menu editor, this feature will only be available if the listing is not currently connected to a third party provider and for listings in English speaking locales.”


SEO Jobs

Are you a Magento developer? This looks like a good gig:


Working with Leslie To is a great opportunity as well:


Here is another opportunity if you are near Dallas:


Marie Speaking at Engage Portland - Is it a good link?

This was a fun talk. I gave an interactive quiz where we looked at different types of links and determined whether each would be considered a good one.


Recommended Reading (All)

SEO Ranking factors panel: SMX West session recap
https://searchengineland.com/seo-ranking-factors-panel-smx-west-session-recap-294517
This is a recap by Eric Enge of the SEO Ranking Factors session at the recent SMX West. This panel focused on how Google’s ranking factors evolve with each new update, and how machine learning may be influencing this (basically, the topics people most like to hear about). By the sounds of it, this was a really interesting discussion and I highly suggest that you check out Eric’s recap.
Here are what the three panelists discussed:

  • Olga Andrienko from SEMrush shared the results of a recent study that looked at the top 100 positions for over half a million keywords. They then used a Random Forest model to build decision trees to look at what the data shows for several potential on-page ranking factors.
  • Marcus Tober of Searchmetrics discussed why ranking studies can be harmful if you believe the conclusions blindly, especially since the conclusion can change depending on the industry.
  • Chanelle Harbin, from Disney/ABC Television Group, discussed several real-world case studies, including the results of implementing schema on certain video pages as well as recipe pages. She also discussed what kind of link building they undertook and how successful it was.

The slide decks for all three panelists are available there for download too, if you want even more information.

Personas Make Your Marketing Stronger
https://kickpoint.ca/why-use-personas/
This is an article by Dana DiTomaso about why creating articles for a specific target audience has been such a successful marketing strategy. She stresses that any persona you create “must be based on fact, not fiction,” meaning you should draw on real data to find an audience segment first and then narrow it down to form your persona. This helps you specifically target your persona and measure your impact. Dana also says that empathy is what makes a persona great. It’s not enough to just collect demographics; you need to find what your persona empathizes with in order to talk directly with them and market yourself in the way that works best for them.
Dana goes on to give a ton of information which explains in detail how to create an effective persona, including how to find out what kind of marketing they would respond best to. I definitely recommend checking this out if your business does any kind of marketing!
How to Create the Perfect Meta Description for SEO
https://www.reliablesoft.net/meta-description/
Alex Chris wrote an informative article here. Some important points to keep in mind for crafting a great, SEO-optimized meta description are:

  • Meta descriptions are not directly used for rankings.
  • Having interesting descriptions will increase click-through rates.
  • The ideal length is between 160 and 320 characters.
  • Try to approach writing them the same way you would write an advertisement.
  • Be sure to test how the descriptions look on all devices, since the way Google renders results can vary. (A quick length checker is sketched below.)
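Here is the quick length checker mentioned above, a minimal sketch with placeholder URLs:

```python
# Report each page's meta description length against the rough 160-320
# character guideline.
import requests
from bs4 import BeautifulSoup

def meta_description(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return tag.get("content", "").strip() if tag else ""

for url in ["https://example.com/", "https://example.com/about"]:
    desc = meta_description(url)
    status = "ok" if 160 <= len(desc) <= 320 else "check"
    print(f"[{status}] {len(desc):3d} chars  {url}")
```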

Google bug bounty for security exploit that influences search results
http://www.tomanthony.co.uk/blog/google-xml-sitemap-auth-bypass-black-hat-seo-bug-bounty/
Tom Anthony has written a really interesting piece about a bug he uncovered in Google. Essentially, Google allowed you to submit a sitemap for a property that you don’t actually own, and if it was done through the special “ping” URL method, no record of the submission showed up in that property’s Search Console. You could submit a URL redirect this way, and Google would still follow it even if it led to a different domain. Tom took this a step further and found that if he submitted a sitemap this way that contained hreflang entries pointing to a different domain, then people would start automatically being redirected to that other domain based on their location.
There were huge blackhat implications here, such as leveraging a competitor’s link equity and passing it off as your own. Thankfully, Tom filed a bug report with Google and the bug has now been fixed.
How the Increased Meta Description Tag Length Affects Your Strategy by Jim Bader
http://www.verticalmeasures.com/search-optimization/how-increased-meta-description-tag-length-affects-your-strategy/
Last December, Google changed the length of meta descriptions from 170 to 320 characters, and since then they have been experimenting with bolding semantic keywords and even replacing your description with content from the page. It’s not always clear why a meta description will or won’t be used, as Jim points out when comparing car insurance listings. He still recommends optimizing as best you can by doing keyword research through Google Search Console, monitoring your competitors in the SERPs, and testing. While experimenting and testing, keep a few things in mind: include semantic language, write like a marketer rather than an SEO tool, use unique descriptions on each page, and try different calls to action.
5 practical tips to prepare for mobile-first indexing By Dawn Anderson
https://www.smartinsights.com/search-engine-optimisation-seo/mobile-seo/5-tips-to-prepare-for-mobile-first-indexing/
1. Switch to a responsive site design as soon as possible.
  • Having both a desktop and a mobile version of a site can often feel like managing two separate sites, which can lead to consistency issues across pages.
2. Don’t jump the gun.
  • Attaining “mobile friendly” status in Google’s eyes doesn’t necessarily make a site user friendly. It’s better to have a fully functioning desktop site than to release a substandard mobile site just for indexing purposes.
3. Everything points to putting UX front and centre of priorities.
  • Google has said in the past that content behind tabs holds less weight, but this will be less of a factor with mobile-first indexing going forward. Keep in mind, though, that if content is shown “on click” via JavaScript actions, the crawler may not be able to access it.
4. Think about “information overload” in an over-connected mobile-first world.
  • In a mobile-first world, content that is concise yet still informative is going to dominate. Make use of executive summaries, named anchors and TL;DRs to meet users’ informational needs quickly.
5. Flash is a total no-no.
  • Completely avoid serving content through Flash, as it is likely incompatible with mobile devices. Many of the features of Flash can now be emulated with other markup and coding technologies.

Recommended Reading (Local SEO)

The “Business Description” Returns to Google My Business: What You Need To Know By Colan Nielsen
https://www.sterlingsky.ca/the-business-description-returns-to-google-my-business-what-you-need-to-know/
For the last two-plus years, Google had removed the ability to edit your business description from your GMB listing. They have now re-introduced this function with updated guidelines, which include restrictions on including links of any kind, providing misleading content, and displaying offensive material, among others. Some things to keep in mind for your business description: multi-location brands can use the same description across listings, and it is OK to include an email address within the listing.
The Guide to Local Sponsorship Marketing - The 2018 Edition by Claudia Cruz
https://moz.com/blog/guide-to-local-sponsorship-marketing
In this local SEO guide, Claudia updates a process for gaining great exposure in an often-overlooked channel: sponsorships. Unlike other ways for a business to gain exposure, local sponsorships aren’t accessible from a single platform, so they can be harder to implement, but they can provide unique benefits if done correctly. Some of the benefits include website links, public mentions, guest posts, media attention, speaking opportunities and much more. This article walks you through a simple process for outreach, including initial contact, evaluating responses and tracking the fulfillment of campaigns. This is a great guide and a must-read for local SEOs who are looking to identify major sponsorship opportunities for local businesses of any size.

How to Manually Flag Reviews on Google, Yelp and Facebook
https://localu.org/blog/manually-flag-reviews-google-yelp-facebook/
Here is another really informative Local U article, this time written by Markella Haynes from MDG Advertising. It is a super comprehensive guide to manually flagging a review for removal and includes site-specific steps for Google, Yelp, and Facebook. Anyone who has a business knows there is always the potential to receive fake negative reviews on these platforms, and although there are algorithms in place to prevent this, they aren’t perfect. As a small business owner, you are responsible for knowing the policy guidelines for these platforms, and it’s sometimes necessary to manually flag reviews for removal when they violate the guidelines and are missed by the algorithms. I definitely recommend bookmarking Markella’s guide to keep as a resource for when that happens.
How Should You Ask for Online Reviews? The Pros and Cons of Each Approach
http://www.localvisibilitysystem.com/2018/04/05/how-should-you-ask-for-online-reviews-the-pros-and-cons-of-each-approach/
Speaking of reviews, Phil Rozek has put together a list of all the different ways business owners can ask their customers to leave reviews. If you are looking for ideas on how to encourage people, or maybe just hoping to find something new you haven’t tried yet, I recommend giving Phil’s article a read.


That's it for this episode! It was a big one! Stay tuned for our YouTube video (my channel is here). If you want to follow me on Facebook, here is my page.