Search News You Can Use
Episode 37 - June 18, 2018

This is a big newsletter with many Google-related changes discussed. We had a couple of possible small algorithm updates, and what appears to be a big wave of manual actions. There is some discussion on “Web Light”, on what to do if you get a big influx of adult links, on whether social signals can affect your SEO, and also a great tip to help you build not only links, but brand E-A-T.

In this episode:



Algorithm updates

June 8, 2018 - Likely quality update

I reported briefly on this update in the last newsletter. This looks like it was a small quality update.


I have a couple of sites which I monitor that are up slightly on this date, and one that has had a big increase. But, most sites did not see much change.

June 16, 2018 - Possible small update

Barry Schwartz has noticed extra chatter about algorithmic turbulence over the weekend. Personally, I do not see any obvious changes in sites that I monitor, but I will keep an eye on this date for you.
It is possible that this turbulence is not an algorithm change, but rather, due to a big wave of manual actions (see below).


Google sent out a bunch of unnatural links manual actions over the weekend

I have heard a lot of chatter from people saying that they have received an unnatural links penalty this weekend. I tweeted about this and am not sure whether this is a confirmation from John Mueller or just a silly tweet:


There is a discussion in BlackHat World on these penalties. Here are some quotes:

  • My site got a Manual Penalty today. All my keywords dropped to the 50-60th position from 1-3 positions!
  • Recently I bought a package of .edu backlink [from a seller from this marketplace] and I got a manual penalty.
  • Lots of people posted that they got manual penalty since 2 days.

It is hard to say exactly what the culprit is. I suspect that Google targeted a specific company (or perhaps a few companies) that sell blackhat links. A couple of people sent me links to places where they purchased links. I’m not going to out those companies though.
Manual actions fascinate me these days as Google is really good at just ignoring a lot of links. These days, if Google gives out a manual action it’s usually because there is a type of link that they are not able to identify algorithmically (yet).


Web Light filter for Search Console

Google has released a new feature in Search Console's performance report which allows you to filter results by how many people viewed your site via the Web Light search appearance.


Web Light is a feature that Google uses to convert a website on the fly for users who are on slow internet connections. The site can load over 50% faster this way. When we first saw this announcement there was a little confusion and I thought that the feature itself was new; however, as John Mueller clarified, it is the Search Console reporting that is new. The feature has been around for a while.


John is right: living in Canada, the likelihood of us noticing these pages being served is small. Cindy Krum has an interesting idea of how these pages are being created:


There was some discussion about whether Web Light is another example of Google stealing a site’s content and serving it as their own. At this point I’m not worried. It seems that if Google is showing Web Light content, it means that the person reading the content would not otherwise be able to do so, as the content was too heavy to be served over a poor data connection.


Google Webmasters clarify details on MFI

After months of speculation across blog posts, conferences, and webinars about how Mobile First Indexing works, the Google Webmaster team have stepped in to clarify some points that they think we have been misunderstanding. See the whole thread for some great insights!


Here are the highlights:

  • With mobile first indexing, it’s the mobile version that is indexed even if it’s different from desktop.
  • You might see an increased crawl rate while Google is moving you over to MFI.
  • Cached pages will show a 404 error. This is a bug and they’re working on it.
  • The mobile speed update in July is not connected to mobile first indexing.
  • Using hamburger menus and accordions on mobile is fine.

The GSC API finally has access to 16 months of data


More coaxing from Google to encourage site owners to switch to https


Links disappearing from GSC link count

Barry Schwartz noticed that several people in the black hat forums were complaining that the GSC link count for their site was decreasing. I have not noticed this for sites I monitor yet.
In the past there have been a couple of occasions where a bug stopped links from being displayed in GSC. It’s hard to say whether this is a bug, whether Google is dropping quite a few links from sites’ link profiles (i.e. related to the manual actions discussed above), or something else.


Google Shopping/AdWords announce new advertising features

Keynote speaker Surojit Chatterjee, product lead for Google Shopping, announced these new features at this week's SMX Advanced. The features include a new competitor price comparison tool, allowing retailers to inform their own pricing based upon others in the market and potentially bid them down. Affiliate location extensions in Search and Display have also been added, allowing advertisers on sites like YouTube to direct customers to a map of nearby stores selling the advertised product when it is clicked on. These will be coupled with new local catalog product cards and additional systems to help retailers set up local inventory feeds to assist with local ads. You can read AdWords' official statement about the features here.
Mike Blumenthal has reported on seeing one of these features, ‘What’s in Store’, popping up already, and others have been sharing their personal favorites among the new features:


Bing begins supporting AMP and JSON-LD

At this week's SMX Advanced, Bing's principal program manager Fabrice Canel made a couple of announcements. The first is that Bing will now be loading AMP pages in its mobile search results (much like Google does), and the second is that JSON-LD will now be supported in Bing's webmaster tools, allowing webmasters to debug their JSON-LD markup for Bing search.
Glenn Gabe has provided an example of what the new JSON-LD debugging tools look like:


Does Click Through Rate help with rankings?

Patrick Stox ran a poll and most SEOs think the answer to this is yes. I agree. I think that Google can see when people by far prefer to visit your site over your competitors'.


Also, Rand Fishkin has done some cool experiments at conferences where he will get everyone to do a search, click on a result, and then magically that result will start ranking better.
When Rand first started doing this, I started seeing ads for companies offering to increase your CTR. I believe Google is pretty good at figuring out when the clicks are real and I do not recommend trying to manipulate this.
But, if you can find ways to legitimately get more people to click on your results, then this likely can truly help!


Search for future students

Google announced a new search feature on Tuesday geared towards prospective U.S. college students. When prospective students (or their parents) Google for a 4-year college program at a U.S. institution, Google returns a panel with information such as average cost (after financial aid), graduation rates, admission rates, and student body demographic makeup.
As Glenn Gabe suggests, this new feature lends itself to a more immersive mobile SERPs experience:


The move into the ‘education information’ field is a new step for Google, but it is preceded by their work on the US Federal Government's College Scorecard, which Google had a hand in creating last year. It also has its roots in the job search feature Google launched last year, with Google product manager Jacob Schonberg stating to EdSurge that it grew out of them asking how they could help searchers find and gain better jobs. He also says that they are not trying to compete with companies like LinkedIn.
This is not good news for sites that currently make money from helping people choose a college.


GDPR causing a lot of concern in the email marketing sector

As this CNBC article reports, marketing agencies are reporting massive drops in email subscribers in the wake of GDPR. As most of you will remember, the new European data privacy law, GDPR, went into effect a couple of weeks ago, and its most noticeable effect was the flurry of email notifications that privacy policies had changed at companies you were subscribed to (often without realising it). It looks like a large proportion of people are either not reading/accepting these new terms, or actively unsubscribing from correspondence from these companies. As a result, some marketers are reporting losses in their mailing lists of up to 80%, which in a $22.16 billion email marketing industry is a very concerning prospect for a lot of marketers/business owners.


Should we be concerned about AMP pages causing duplicate content?

Someone asked about this in a recent help hangout. John Mueller said that this is not an issue as this is what Google expects to see. Your AMP content should be extremely similar to the non-AMP equivalent.


Google will recrawl your 404 pages for a long time

John Mueller said in a hangout that when Google finds a 404 page on your site, they will still come back several times and try to recrawl that content. This is why it can take a while for crawl errors to leave GSC.
I also wanted to add that having crawl errors in GSC is not necessarily a bad thing. John has said in the past that these really should only affect your site if they are causing frustration for users.


If you get an influx of adult links pointing to your site, should you worry?

John Mueller said in a hangout that Google is really good at detecting this type of link and just ignoring it. While there is no harm in disavowing spam links like this, in most cases it really shouldn’t make a difference.
Personally, where I do want to disavow an influx of spam links is for sites that are “on the edge” in terms of link quality. If you have a history of self-made SEO links that are potentially penalty worthy, then I do think it’s possible that a sudden negative SEO link attack could bring that past history to light.
But, if you have not been heavily involved in link buying or other link schemes, I feel really comfortable just ignoring a link attack like this.


Be cautious when splitting/merging a site

In a recent help hangout, John Mueller addressed what can happen with regards to search rankings when you take part of an existing site (in this case the French language portion) and migrate it to a separate site. What John cautions is that it is unlikely for a split or merged site to rank as well as it did. It will take some patience to let the dust settle and see where Google ends up putting the new/modified sites. Here is what he said:
“I think the main thing that you need to be aware of is that when you split a website, or when you merge a website, it’s a lot harder for us to process that compared to a normal site. So that’s something where I would go with the expectation that it’s going to take a bit of time for everything to settle down, and it’s not absolutely clear what the final state will be.”


Media Actions replace ‘Music’ and ‘TV and Movies’ feature pages

The Media Actions feature will “enable users to initiate media content (e.g. songs, albums, movies) on content provider applications via Google Search and the Google Assistant.” Initially the feature will only be available to Google partners, with individual publishers needing to request separate access. This is the first time Google is supporting a stand-alone structured data feed of JSON-LD data instead of just ingesting schema markup from sites. You can read more about Media Actions here.


What does it mean when you see “ok google” as a keyword in GSC?

When people start a voice search with “ok google”, Google apparently strips out that part of the query in GSC. If you’re seeing that in your keywords list, it’s likely because people thought Google didn’t hear the first time, and then repeated the “ok google” part.


However, I just checked quite a few of my clients’ sites which I know get a lot of voice queries (because they own a lot of featured snippets). I couldn’t see a “google” related keyword on any of them.


Neat tweet about the impact of internal links


Is translated content considered duplicate content?

According to John Mueller in a hangout, no. However, John has said in the past that you do not want to use auto-translated content (i.e. content that is run through Google translate or another automated translation tool) as Google does not like this. But, if you have human translated content on your site, this is great.


Tip! Using SEMrush to determine how many subdomains a site has


New AI research from Google

Google has published a research study on a new algorithm they have been working on. The study, titled ‘Understanding Semantic Textual Similarity from Conversations’, is on an algorithm designed to better understand the nuance of short questions. Short questions are often tricky for machines to understand. For example, ‘How old are you?’ and ‘What is your age?’ are semantically very similar but structurally different. However, take ‘How old are you?’ and ‘How are you?’ which are two very similar sentences structurally but not semantically! The algorithm is designed to understand and encode the differences between these types of conversation patterns by studying Reddit conversations.
What is the potential application of this algorithm in the future? Google has been quiet on this, but on its AI blog they suggest that these advancements will allow GoogleBot to understand much more with a significantly smaller data set than before. Roger Montti at Search Engine Journal suggests that this kind of research is a step closer towards Google's ultimate goal of creating a Star Trek style computer, or in plain terms, a far more advanced Google Assistant.


AMP mobile traffic data differences by country

Alexis Sanders reports on traffic changes after switching a popular site to AMP. There was a significant traffic change seen in the UK and Brazil but no noticeable change in the US.


Link building (and E-A-T building) tip for brands

As soon as I saw the story below I thought it was a great idea for getting links and also getting more people talking about your brand:


While doing something as massive as this might seem like an extreme measure, I bet that every business could come up with something similar. I do believe that the more you can get people talking about your brand, the better, and that this helps towards establishing E-A-T (Expertise, Authoritativeness and Trust).
If you’re a local business, can you think of something you can do to make your community more awesome? A way to brainstorm on this is to look at the subreddit for your city. What is everyone complaining about?
Here are some examples I thought of after looking at the subreddit for my city:

  • If construction is making travel a nightmare, you could organize a ride sharing board and encourage people to join in.
  • If there was a natural disaster, a fire, etc. is there a way you could gather people together to truly help the victims?
  • Are people complaining about the streets being full of garbage? Organize a one day cleanup event. Put a signup form on your website. Then, spread the news to the press. They’ll link to you when they share the signup form.

More image results appearing in the SERPs

STAT Search Analytics have noticed a steady increase in images appearing alongside results in the SERPs over the past couple of weeks.


 


What are the most ignored SEO tactics?

Bill Slawski posted his top 7 most underused SEO strategies. You can read his full list here, but a couple that we really liked were i) giving meaningful headings to all lists on your pages, which should help with featured snippet wins, and ii) optimising image file sizes/dimensions while also labeling them appropriately with alt attributes.


Analysis of Nofollowed links for contributor-driven content sites

This is a really interesting study from Jonathan Greene on the current trend for contributor-driven content sites like Forbes and Inc. Magazine to nofollow links back to authors' pages. The analysis looks at what effect this change has had upon the publishers' organic traffic, and it looks like it has been a negative one (although this cannot be considered conclusive proof, of course).
I have always maintained that linking out to authoritative sites is potentially a ranking factor. This is highly speculative though, as Google has said in the past that it is not. Still, I do recommend linking out with followed links from your content.


MetaFilter is struggling

This was an interesting and sad post on MetaFilter. Glenn Gabe pointed out that this site used to perform really well and has been hit hard by several Google updates:


I included this story in the newsletter as I have been following the story of MetaFilter since its massive Panda hit in November of 2012:

(Image from SEL)

MetaFilter was considered the poster child of Panda victims. From what I can see, almost all of the content on the site is user generated. The site has had issues with inaccurate YMYL advice, poor quality comments and much more. They’ve also been a source of unnatural links as they come up in a lot of our link audits.


Tip! Find out your on-site search terms via Google Analytics

This is a really neat trick courtesy of Paul Thompson. If you have a WordPress site, you can go into Google Analytics and set a search parameter for the letter ‘s’. This will return all of the search queries from the internal search function on your website. Knowing what your users are directly asking for from your website gives you some great insights for future content ideas. Or, if that information is already there, this lets you know that you may need to fix something in your site navigation to make it easier for users (and GoogleBot) to find.
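If you would rather sanity-check this outside of Google Analytics, here is a minimal sketch (the URLs and search terms are hypothetical) that pulls the same ‘s’ parameter straight out of a list of request URLs, for example from a server log export:

    # Quick, hypothetical alternative check: count WordPress internal searches
    # by reading the "s" query parameter from a list of request URLs.
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    urls = [
        "https://example.com/?s=blue+widgets",
        "https://example.com/?s=returns+policy",
        "https://example.com/?s=blue+widgets&paged=2",
    ]

    searches = Counter()
    for url in urls:
        params = parse_qs(urlparse(url).query)
        for term in params.get("s", []):   # "s" is WordPress's default search parameter
            searches[term.lower()] += 1

    for term, count in searches.most_common():
        print(f"{count:>4}  {term}")

The same idea works for any CMS; just swap ‘s’ for whatever query parameter your internal search uses.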


Do social signals affect SEO?

It is widely known in the SEO community that links from social sites (e.g. Facebook, Twitter, Instagram, etc.) likely do not help improve your PageRank. However, I do think that genuine positive social chatter about your brand can speak to your E-A-T.
Andy Drinkwater posted about a site that saw improvements after some good social activity.


Video Carousels being seen more often in the SERPs

This will be good news for you if you post a lot of highly rated video content. Otherwise it may be seen less favourably, as more page one real estate is lost for organic results.


Top Stories feature is an Organic search feature

Top Stories are not, as many believe, a feature of Google News, which runs on separate algorithms from organic search. John Mueller has confirmed this before in help hangouts and at BrightonSEO, and he recently reiterated it at the end of May, stating:
“Yes, as far as I know the top stories carousel is an organic search feature. So the same kind of information with regards to language information and country targeting apply there as well. So geo targeting is either based upon the country code TLD or with a generic TLD based on Search Console settings. And the language  information you either don’t have to bother, because we can recognise most languages perfectly fine on a page. If you have different language version of the same content, we can use hreflang to help us there”


Moz Metrics are off


Local SEO

Some new local insights added to the GMB dashboard


Warnings over transitioning to the new GMB agency dashboard!

Google My Business recently launched a new agency dashboard to allow agencies that handle multiple GMB listings for companies with several locations to manage those locations all together without the need to log in and out of each one. Sounds fantastic, right?
While the initial buzz around the release was that of excitement, very quickly the local SEO community started to report issues with migrating to the new platform.


Joy Hawkins has shared some of her most pressing frustrations with the new system


Google is now showing returns policy in the Knowledge Panel


The following thread has some great insights in it, including a couple of observations that suggest this content is being inputted manually and not through scraping websites for their terms of service pages (at least not all the time). It is also interesting to note that having a good, easily accessible returns policy is part of the Quality Raters' Guidelines for YMYL sites.


‘Near me’ is the new city search

As Ginny Marvin reports from SMX, the data suggests that the sharp rise in ‘near me’ queries has coincided with a significant decline in city-specific searches. As can be seen in the thread below, this rise in ‘near me’ queries raises some new and important questions for local SEOs to answer. For instance, what is the search radius for ‘near me’? Is it the same for every locale?


Google is now dynamically displaying business categories


Fake secondary GMB listing

According to the GMB guidelines, GMB listings must be for real-world businesses that can be visited by customers during their stated opening hours. As you can see, this local business has created two separate GMB listings and both are appearing in the Top 3 pack.


This would be perfectly fine if these were two different locations of the business (for example, two different restaurants of a fast food chain). However, in this case the business owner/SEO has listed both the legitimate business address and their home address so that they can have two listings instead of one in the SERPs. This practice is deceptive to searchers and goes against the GMB guidelines, and as we reported in last week's newsletter, it can have more serious implications than wasting people's time when the listing is not for a heating service but for a YMYL service such as emergency medical care.


New GMB Attribute for venues that show sports


SEO Tools

Chrome Extension to easily view page archives and cache

This looks like a handy little Chrome extension that could save you some time. We are certainly going to be giving it a try!

Regular Expressions 101

Regex101.com is a great tool for checking/debugging your regex (regular expressions) in PHP, JavaScript, Python and Golang. Thanks to Andrew at Optimisey for the recommendation:
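As a quick illustration, here is the kind of pattern you might paste into regex101.com before using it in a filter or rewrite rule (the paths are hypothetical), checked here with Python's built-in re module:

    # Hypothetical example: flag /blog/ URLs that are missing a trailing slash.
    import re

    pattern = re.compile(r"^/blog/[\w-]+$")   # matches only when the trailing slash is absent

    for path in ["/blog/my-first-post", "/blog/my-first-post/", "/about"]:
        print(path, "->", "missing trailing slash" if pattern.match(path) else "ok")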


SEO Jobs


Recommended Reading

Link Building Strategies that Scale in 2018 - Ross Hudgens
July 8th, 2018
https://www.siegemedia.com/marketing/link-building-strategies
In this video from Siege Media, Ross Hudgens covers some legitimate link building strategies. This is all white hat advice that won't land you with a penalty, and it is solid advice if you or your client needs to build a valuable link profile. Some basic advice is to keep track of where you are being mentioned (especially if you are a big brand) and make sure those mentions are correctly linked to your site. Moreover, look for link moves. Link moves are when, say, your homepage has been linked to, but that link love would be far better served pointed towards a specific category page. Approach the author and ask them to redirect the link. Most sites shouldn't have a problem doing this and it can really hone the power of your existing link profile. Ross also suggests building really strong, niche-specific guides that can command a large number of links, and he details how, and why, you should find tangential markets to write great content for, which will help you get legitimate links.
Leveraging Machine Readable Entity IDs for SEO - Mike Arnesen
June 7th, 2018
https://www.upbuild.io/blog/machine-readable-entity-ids-seo/
Although a fairly out-there idea, or as Mike Arnesen calls it, a ‘bleeding edge’ concept, this is a really good article about how ‘Machine Readable Entity IDs’ can/could be utilised for SEO.
A machine-readable entity ID is a string of characters used to identify a single named entity. They first appeared from Freebase/Metaweb, a company which Google now owns. Single named entities are specific proper nouns, like a single landmark or person, as opposed to plural entities such as places or people, and identifying them is important for eliminating ambiguity in search. For example, there are many companies with the same name, and that name will be associated with each company more or less depending on where you enquire about it. To avoid that, and the issues it would cause for search, it becomes much easier to identify these entities by a unique ID - kind of like a digital barcode for single named entities! Google is assigning two different types of machine-readable entity IDs (MREIDs). The first is for entities that were ID'd by Freebase and the others are new ones from Google, and they are using these MREIDs across all of their systems (Trends, Search, GMB etc.) to create (and potentially understand) the connections these entities share across the network.
Mike goes through the process of finding what your MREID is (if you have one) and how to register one if you don't. He then walks through how to add it to your site with GTM or JSON-LD structured data, and how to implement equityLink.js on your site to add your MREID tag to all of your brand name mentions. UpBuild are currently doing this themselves as a test and are going to report back the results, but in theory this should help Google understand the different connections your brand has across the web. We are excited to see what results are returned!
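To make the idea a little more concrete, here is a minimal sketch of what JSON-LD markup referencing an MREID might look like. Note that the entity ID, organisation name and URLs below are hypothetical, and this is not a reproduction of UpBuild's equityLink.js approach:

    # Build a hypothetical JSON-LD Organization block that references an MREID.
    # The entity ID and URLs below are made up for illustration only.
    import json

    MREID = "/m/0abc123"  # hypothetical Freebase-style machine-readable entity ID

    org_markup = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": "Example Agency",
        "url": "https://www.example.com/",
        "identifier": MREID,  # schema.org's generic identifier property
        "sameAs": [
            # One way SEOs reference the entity: a Knowledge Graph lookup URL
            f"https://www.google.com/search?kgmid={MREID}",
        ],
    }

    print('<script type="application/ld+json">')
    print(json.dumps(org_markup, indent=2))
    print("</script>")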
Feed the Machine: Newsweek and the race to fill Google with suicide news - Bijan Stephen
June 12th, 2018
https://www.theverge.com/2018/6/12/17448816/newsweek-anthony-bourdain-seo-headlines-suicide-contagion
Bijan Stephen from The Verge reports on a distasteful SEO tactic employed by Newsweek that has been highlighted in the wake of Kate Spade and Anthony Bourdain's suicides last week. While much of the print media ran suitably ‘sober and measured’ headlines, Newsweek took a different approach.
Newsweek capitalised on trend data that showed a spike in queries regarding Bourdain's net worth and his ex-wife, and ran clearly search-optimised headlines such as “What Is Anthony Bourdain’s Net Worth? Chef, Found Dead At 61, Built Cooking Empire, But Money Wasn’t Top Concern” and “Who Is Anthony Bourdain’s Ex-Wife Ottavia Busia? Chef Dead At 61”. Similar headlines were published around Spade's death as well. The headlines have since been changed to be less brash after growing online backlash, but they are still clearly optimised.
While not a pleasant example, Bijan does point out that there is a growing trend of publications producing more search-optimised content, which makes sense since their revenue is pinned to traffic levels. In essence, all Newsweek did was research what searchers wanted information on and provide it to them in a timely and accessible manner, a practice that, as SEOs, we often recommend. In this case, however, some more editorial discretion was needed to see where the line is between SEO and journalistic responsibility.
Is Bing Really Rendering & Indexing Javascript? - Dan Sharp/ScreamingFrog
June 13th, 2018
https://www.screamingfrog.co.uk/bing-javascript/
While Google has been making faster leaps towards being able to crawl, render, index and rank JavaScript pages over the past few years (in particular at this year's Google I/O), Bing has been a little slow off the bat. In this article, ScreamingFrog founder Dan Sharp investigates Bing's claims from PubCon Vegas last year that the search engine is now actively crawling and indexing JavaScript pages.
Dan looks at some initial tests run in-house and by other SEOs that suggest that Bing is still struggling to index JavaScript pages (a good example being the angular.io homepage). However, Dan points out that like Google, Bing only renders content when they think it is necessary and they have the resources. And indeed, in the last couple of months there is some evidence that Bing is doing this more frequently. While investigating these cases, he sees some interesting things, such as Bing showing the original HTML page titles returned in the initial response instead of the rendered titles from the JavaScript. Dan's final conclusion is that the results are a mixed bag that could suggest Bing is making steps towards being better at rendering and indexing JavaScript, but things are a long way off at the moment.
Reducing the time it takes to write meta-descriptions for large websites - Paul Shapiro
June 13th, 2018
https://searchengineland.com/reducing-the-time-it-takes-to-write-meta-descriptions-for-large-websites-299887
While there are numerous SEO issues when working with large websites (often either related to technical problems, or problems of scale), Paul Shapiro writes about one particular issue of scale - writing vast numbers of meta descriptions.
Meta descriptions may not be the most exciting topic, but they provide a very important SEO function by allowing us to basically write our own ad content. However, with very large websites, optimising this content on thousands of pages is very resource intensive. Therefore, Paul has provided us with a step-by-step process for semi-automating the task using a summarization script and a meta description implementation tool. While this won't write your meta descriptions for you, it should speed up the process!
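To give a flavour of what a summarization script can do, here is a minimal sketch (this is not Paul's actual script, and the page copy is made up) that scores sentences by word frequency and trims the best one down to meta description length:

    # Not the article's actual tooling - a minimal, frequency-based extractive
    # summarizer that drafts a meta description from on-page copy.
    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are", "for", "on", "with"}
    MAX_LENGTH = 155  # rough character limit for a meta description

    def draft_meta_description(text: str) -> str:
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS)

        def score(sentence: str) -> float:
            tokens = [w for w in re.findall(r"[a-z']+", sentence.lower()) if w not in STOPWORDS]
            return sum(freq[t] for t in tokens) / (len(tokens) or 1)

        best = max(sentences, key=score)
        return best if len(best) <= MAX_LENGTH else best[:MAX_LENGTH - 1].rstrip() + "…"

    page_copy = (
        "Our blue widgets are hand finished in Canada. "
        "Blue widgets ship free on orders over $50. "
        "Read about the history of widget making."
    )
    print(draft_meta_description(page_copy))

As with Paul's process, the output is only a draft; a human still needs to review and polish it before it goes live.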
What Happens When a Werewolf Bites a Goldfish, SearchLeeds April 2018 - Hannah Smith
June 13th, 2018
https://www.slideshare.net/HannahBoBanna/what-happens-when-a-werewolf-bites-a-goldfish-searchleeds-april-2018
This slideshow has been circulating a lot in SEO/marketing circles. It is from Hannah Smith's SearchLeeds presentation this year, where she answers the question ‘Where do you get your ideas?’ about the fantastic content she creates. Great content gets readily shared and linked to from reputable sources, all of which leads to good long-term rankings. So how does she get her ideas?
Hannah aptly quotes Neil Gaiman at the start of her presentation about the frustration that can come with the question ‘where do you get ideas from’. But Neil’s advice of asking yourself simple questions and noticing the ideas that come to you all the time is very valuable. This leads her to suggest we focus on asking ‘what if’ questions. Like ‘if a person is bitten by a werewolf they turn into a werewolf, so what if a goldfish was bitten by a werewolf?’.
So how do you turn these ideas into linkable content? Hannah walks us through three very successful campaigns that she was a part of, which are well worth reading. Some of the main takeaways are:

  • Ask yourself, do people want to read this? Are they still interested? Without a willing audience, even the quirkiest idea will fall flat.
  • How will this content be shared? Think about whether the content you make will be easy for journalists to use as a springboard. What ideas can follow on from it? Make sure you have tons of different angles that can be taken and pitch all of them!
  • Get a deep understanding of the media landscape: create content that allows journalists to write the stories they want to tell
  • Keep your eyes and ears out for potential ideas. Collect, nurture and develop them.
  • Really work your ass off

Things aren’t always what they appear to be - Patrick Stox
June 13th, 2018
https://www.slideshare.net/SearchMarketingExpo/things-arent-always-what-they-appear-to-be-by-patrick-stox
This is the slideshow from Patrick Stox's presentation at SMX Advanced in Seattle this week. It revolves around the different things you should be checking to diagnose a technical issue that is causing a site drop. These are issues unrelated to algorithm changes, seasonality, or tracking errors that have altered your traffic (and those should be ruled out first, before you start a lengthy technical audit!).
Most often, something has been changed somewhere on the site, whether it be in the theme, the content, the tags, the structure etc. that is causing this downturn in rankings. Some of the interesting technical audit suggestions Patrick gives are:

  • Use crawl comparisons, archive.org and change monitoring to help track down when and where a change was made
  • Make sure to discover and fix broken redirects (tip: make sure they all have valid security certificates) - see the sketch after this list
  • Make sure your internal links are clean and that your external links are still pointing to where you want them, from where you want them
  • Use the inspect source function to make sure a coding mishap hasn’t broken your <head> early, which can cause tags and other important information to be missed by GoogleBot.
  • If using JavaScript, make sure all tags are seen on Google's first go-around (pre-render or include them in the raw HTML) instead of sending them from inside the JavaScript, which is rendered after the first crawl
  • Use ‘&filter=0’ on your search URLs to see which pages are being considered for the query. This can give you ideas of what could potentially be merged.
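On the broken-redirects point above, here is a minimal sketch of the kind of quick check you could run (the URLs are hypothetical and the requests library is assumed); it reports each redirect chain, the final status code, and flags certificate problems:

    # Quick redirect-chain check: print each hop, the final status code,
    # and flag SSL certificate problems. URLs are hypothetical examples.
    import requests

    urls_to_check = [
        "http://example.com/old-page",
        "https://example.com/moved-category/",
    ]

    for url in urls_to_check:
        try:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        except requests.exceptions.SSLError as err:
            print(f"{url} -> SSL certificate problem: {err}")
            continue
        except requests.RequestException as err:
            print(f"{url} -> request failed: {err}")
            continue

        hops = [r.url for r in resp.history] + [resp.url]
        print(f"{url} -> {resp.status_code} via {' -> '.join(hops)}")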

How to protect your content's credibility in the era of fake news - Noelle Schuck
June 12th, 2018
http://www.verticalmeasures.com/content-marketing-2/how-to-protect-content-credibility/
Noelle Schuck takes a look at what it takes to produce valuable, informative and, most importantly, factual content. This requires that your content is sourced, and credibly sourced, i.e. from somewhere that has expertise and experience on the topic at hand. This is particularly important in the modern world of search, where E-A-T is becoming such a strong ranking factor.
Noelle offers some good advice on sourcing practices. For instance, it is better to use primary sources (those who did the original research) rather than secondary sources. She goes on to explain the differences between reputable sources and how to determine whether you are sourcing from the original content. Another useful tip that Noelle gives is that you don't have to link to every source you use; it is more important to mention them by name and only to link when that will add extra value for your reader.
When Bounce Rate, Browse Rate (PPV), and Time-On-Site are Useful Metrics... and When They Aren’t - Rand Fishkin/Whiteboard Friday
June 15th, 2018
https://moz.com/blog/useful-not-useful-metrics
This week’s Moz Whiteboard Friday talks about when bounce rate, browse rate and time-on-site metrics are useful, and when they are not. Some of Rand's tips on when not to use these metrics: when using them instead of, or without reference to, conversion actions, and when using them out of context for the type of site being analysed. Another interesting point is that you need to consider how users got to a page when looking at these metrics. For instance, someone arriving from Twitter is more likely to bounce than someone arriving after a deliberate Google search. Rand also covers the useful applications of these metrics, which largely revolve around using them within a larger analysis and diagnostic process.
Conflicting Website Signals and Confused Search Engines - Rachel Costello
June 14th, 2018
https://www.slideshare.net/RachelCostello10/conflicting-website-signals-confused-search-engines-rachel-costello-technical-seo-at-deepcrawl/1
This presentation is from SearchLeeds, where Rachel Costello spoke about how Google (and other search engines) make sense of the mismatched signals that they come across when trying to crawl and index the web. She starts by noting that since a lot of website owners have no knowledge of SEO, Google has algorithms and systems in place to try and figure out what must be very confusing URL implementations. Some of the points that stood out to us were:

  • While Google will try and figure out your site for you (creating its own canonical pages etc.) it doesn’t get this right all the time, so don’t take the risk. Make sure your implementation is spot on so Google doesn’t have to do anything itself.
  • Canonical tags are not directives; they are signals. Moreover, remember that unique content on a page that is canonicalised to another will be ignored, so only canonicalise pages that are identical or near enough. Conversely, Google may ignore canonical tags if the content appears to be significantly different on the pages. (A quick way to spot-check your own pages for these signals is sketched after this list.)
  • Disallowed URLs may still show up in search for a couple of reasons, such as if they are linked to internally or are included in an internal sitemap.
  • Canonical tags are ignored on pages with noindex tags.
  • The best way to really test what URL selection Google is doing is to use an info: query and search the cache.
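If you want to spot-check a page for conflicting signals like these, here is a minimal sketch (the URL is hypothetical, and a real audit should use a proper crawler rather than regexes) that pulls out the canonical and robots meta values:

    # Quick spot-check of a page's canonical and robots meta signals.
    # The URL is a made-up example; use a proper crawler for real audits.
    import re
    import requests

    url = "https://example.com/widgets/blue/"
    html = requests.get(url, timeout=10).text

    canonical = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', html, re.I
    )
    robots = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']', html, re.I
    )

    print("Canonical:", canonical.group(1) if canonical else "none found")
    print("Robots meta:", robots.group(1) if robots else "none found")

    # As noted above, a noindex page with a canonical tag sends mixed signals.
    if robots and canonical and "noindex" in robots.group(1).lower():
        print("Warning: noindex + canonical on the same page is a conflicting signal.")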

Recommended Reading (Local SEO)

Google My Business Guidelines Are like Traffic Signals To New Yorkers - A (Very) Rough Suggestion - Mike Blumenthal
June 11th, 2018
http://blumenthals.com/blog/2018/06/11/google-my-business-guidelines-are-like-traffic-signals-to-new-yorkers-a-very-rough-suggestion/
In this humorous blog post, Mike Blumenthal comments upon an annoying but prevalent reality of local search - what the GMB guidelines say you can/should be doing and what people actually get away with are very different things.
Reviews and Lawsuits: You can win for losing and lose for winning - Mike Blumenthal
June 13th, 2018
https://www.getfivestars.com/blog/reviews-lawsuits-can-win-losing-can-lose-winning/
Mike Blumenthal writes about the two strange cases that have been in the news recently about lawsuits regarding reviews. The strange thing about the cases is that one makes perfect sense and is likely to be lost and the other doesn’t make much sense at all and might actually win. Mike uses these two cases to illustrate (in the extreme) the different effects review responses can have.
The first case is one we have reported on before - the small Californian winery who claim to be plagued by negative spam reviews from one person using multiple accounts. Sadly, the law is against them, with federal statute 230 (of the Communications Decency Act) ensuring that Google, or any other third party comment service (like a review platform), is not responsible for the content published on it by users. So the winery will probably lose. Mike does think that the lawsuit was still a good idea, since it cost them very little and, from an SEO perspective, their presence has grown exceptionally well!
In the second case, a woman posted a negative review after a bad experience with her new doctor, and the doctor filed defamation charges against her. Defamation is one of the only ways to pierce the 1st amendment right to free speech in New York State. After the charges were levied the review was taken down; however, the doctor's practice is continuing with the case at great expense. The practice should have left it at that, but by pushing on with the case they have opened themselves up not only to a costly legal process, but also to being an open target online for more negative reviews, which can only harm their business in the long run.
How does Yelp's review solicitation penalty work? - Joy Hawkins
June 14th, 2018
https://searchengineland.com/how-does-yelps-review-solicitation-penalty-work-299893
Joy Hawkins writes about the Yelp review solicitation penalty. The penalty was announced late last year (November 2017) and follows Yelp's tradition of being a lot harsher on non-legitimate reviews than other platforms such as Google or Facebook. Joy notes a couple of interesting and important things about the penalty. Firstly, Yelp will notify the business before the penalty goes into effect and they will have 90 days to rectify the situation. The penalty will then continue indefinitely until the issue is corrected and the business stops soliciting reviews (the solicited reviews will also be taken down and the business's ranking suppressed in the meantime). The lesson learned? Don't solicit reviews for Yelp!
(As an added point of interest, it is not against Google's TOS to ask people for GMB reviews provided you don't offer an incentive.)
Proven link building strategies for local business - Rosie Murphy/Bright Local
June 15th, 2018
https://www.brightlocal.com/2018/06/15/proven-link-building-strategies-for-local-businesses/
This is a webinar from BrightLocal with Casey Meraz from Juris Digital, Matt Lacuesta from Earned & Owned, and Myles Anderson from BrightLocal. The webinar discusses the tactics they employ to get the best backlinks they can for local search. Their ideas range from sponsoring local sports teams to monitoring existing brand mentions and building links from events and conferences. It is a fairly long webinar, but a couple of the tips we liked were:

  • Don’t seek backlinks from sites that already have a ton of sponsored content. If you think about it, this kind of link isn’t going to give you a whole load of value in terms of CTR and in general that site will have less link equity to share.
  • Prioritise links from big, well known and respected sites. A link from the New York Times can do a heck of a lot more than 20 links from lesser known sites!
  • Tutoring/organising conferences/creating resources in the community is also a great way to get quality links

Vancouver lawyer gets $1 in damages after suing Google Plus ranter - Jason Proctor
June 14th, 2018
http://www.cbc.ca/news/canada/british-columbia/lawyer-defamation-google-plus-1.4706419?cmp=rss
In this news article from CBC, Jason Proctor reports on a lawsuit in British Columbia brought by lawyer Kyla Lee against a former client who left a negative review on her G+ profile. Lee argued that the negative review cost her firm upwards of $15,000 in lost revenue; however, the judge in the case stated that no such evidence had been presented to him. The case was won by default as the defendant did not turn up to court, and the $1 award for damages was delivered with a stern warning from Justice Murray that the suit should not have been brought forward in the first place. Justice Murray went on to say that if a business opens up a G+ profile they should not expect glowing reviews all the time, and noted that prospective clients are not expecting a perfect roster of good reviews either. This last point is interesting, as Google has suggested similar things when referring to how they use reviews as a ranking factor for local search.


Where to find Marie

We are now live with our podcast on iTunes and Google Play Music
iTunes
Google Play Music
Youtube (weekly live updates)


That's it for this episode! Stay tuned for our Youtube video (my channel is here). If you want to follow me on Facebook, here is my page.