Search News You Can Use
Episode 107 - November 6, 2019

In this episode, we look at all of the recently published resources on BERT and pick out the important parts you should know. We also cover tips from the Google Webmaster conference. This is a great episode with a lot of awesome SEO tips, so be sure to read it all!


Marie’s Podcast for this episode

If you would like to subscribe, you can find our podcasts here: iTunes, Spotify, or Google Play

Ask Marie an SEO Question

Have a question that you want to ask Marie? You can ask it on our Q&A with Marie Haynes Consulting page and Marie will answer some of the best questions each week in the podcast!


In this episode:


Algorithm Updates 

November 3, 2019 - possible small update?

We had several clients with significant upticks in Google organic traffic starting November 3, 2019. Given that we are writing this on November 5, it is a bit early to judge what happened.

Most of the sites that saw increases starting November 3 had a similar pattern.

It is also important to note that each of our clients that saw improvement on November 3 runs a site that truly has been actively working on improving quality. In some cases when we see a minor algorithm update, we feel that Google has changed some criteria in how they assess quality. In this case, we think it is possible that Google is recognizing the changes these sites made.

At this point we’ll call November 3, 2019 a minor quality update. We’ll let you know if this seems more significant.


Recent discussion and clarification on BERT

Summary of BERT information that is important to SEOs

We covered this in detail for our paid subscribers last week. This week, this info is available to everyone. We’ve also added new points that have come to light since last week. Here is a point form list of the important things we should all know about BERT.

  • The change impacts 1 in 10 queries.
  • It only affects English queries right now, except that Google is also using BERT to improve featured snippets in the other languages/countries where they are available.
  • It is designed to help Google understand queries more like a human would. For example, previously if you searched for “parking on a hill with no curb” Google would have put too much emphasis on “curb” and present results for parking on a hill with a curb. Now, they can appropriately understand the query.
  • BERT is a neural network that was trained on the entire English text of Wikipedia.
  • It is not currently being used for ads.
  • There is no real way to “optimize for BERT”. However, if you have been writing text that is geared for search engines rather than humans, this is likely going to be less effective now. 

Dawn Anderson’s amazing BERT article

We recommend that you read this great article on BERT. However, it is a BEAST of a post. The first half of the article is all about how BERT works to understand language and is a fascinating read. Here is a summary of the parts that are important for SEOs to know:

  • BERT is not an algorithmic update like Penguin or Panda. 
  • BERT does not judge web pages either positively or negatively, but rather, improves the understanding of human language for Google search.
  • BERT will be huge for conversational search and Google Assistant.
  • BERT may eventually impact what is seen in People Also Ask boxes.
  • BERT could be important for entity determination. (Our note: This could be important in determining factors related to E-A-T)

There’s no such thing as a BERT score says Google

In a recent tweet, Danny Sullivan was asked about how BERT values a page. Danny clarified that BERT doesn’t assign value to pages. In a recent Hangout, John also said that he doesn’t believe there is a specific way to optimize for BERT apart from writing naturally. Remember, BERT is meant to help Google understand language at the level of sentences and phrases. Write naturally and hopefully you’ll reap the rewards!


Some of the “misinformation” clarified 

This is a good thread on the recent discussion surrounding BERT and misinformation amongst SEOs. Dawn summarizes BERT nicely in her tweet emphasizing that the algorithm is meant to understand the context of words in a sentence. Doing this can help it understand and connect different sentences. 

She also shared this lengthy read on natural language processing and BERT for those who want more clarification. 


Can BERT be used for understanding our pages as well?

Apparently, BERT understands not only queries but also the context of the words on a page. More specifically, it can help make sense of longer, intricate queries and solve language-related tasks.

Ultimately, BERT is a massive change that will help provide more targeted results to users by better understanding exactly what they are asking and how these words relate to each other.

One note we would like to make here is that we do not yet have confirmation from Google that they are using BERT for more than just understanding queries. We have speculated in the past that Google will get better at understanding the nuances in our writing on pages. If this is true, we may see that Google gets better at recognizing that certain articles on sites are extremely helpful even if the rest of the site has questionable content.


How to use BERT for meta descriptions

You’ve probably heard (correctly) that there is no way to “optimize” your site for BERT. But Hamlet Batista over at SEJ shows that we can use automated text summarization to generate meta descriptions at scale using either an extractive (splitting a text into sentences and ranking the sentences as summary content) or abstractive (creating original sentences to capture the essence of the text) approach. 

Hamlet walks us through how to create the code needed for text summarization for these purposes. Fascinating stuff!
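
Hamlet’s post contains the full code, so read it for the real thing. As a rough illustration of the extractive approach only (our sketch, not Hamlet’s implementation), here is a minimal frequency-based sentence ranker that picks the most representative sentences as a meta-description draft:

```python
import re
from collections import Counter

def extractive_summary(text, max_sentences=2):
    """Score each sentence by the frequency of its words across the
    whole text, then return the top-scoring sentences (in their
    original order) as a meta-description draft."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]+', text.lower()))
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r'[a-z]+', s.lower())),
        reverse=True,
    )
    top = set(scored[:max_sentences])
    return " ".join(s for s in sentences if s in top)
```

A real pipeline (like the one in Hamlet’s article) would also enforce the roughly 155-character limit and use a proper NLP model, but the ranking idea is the same.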


MHC Announcements

It’s important to write, discuss and share! 

Working together is something SEOs have the unique ability to do. We share insight, knowledge and professional experience on an array of topics in a global capacity. Sharing your theory is great as it generates open discussion that can lead to new ideas. We need to ensure we encourage each other to embrace our analytical side when major changes occur in the industry. With all that has come forth surrounding misinformation about BERT, just remember to separate fact from theory and there is no harm done! 

Thanks for this great reminder, Marie! 


Happy Movember from MHC! 

Some of our guys have decided to defy their girlfriends and go ahead with growing moustaches in support of men’s health. If you’d like to support them, you can donate here. We’ll share some pictures with you as the boys get hairy! 


Google Announcements

Google’s official WordPress plugin, Site Kit, is now available for all

This past week Google announced that Site Kit is available for everyone and can be found in the WP plugin directory. If you own a WordPress site, oversee or work on one, or work on a plugin or hosting provider, there are several benefits for you. Check out the link below for more.


The Speed report in GSC is beginning its rollout

Not familiar with the speed report? It will sort similar URLs into groups based on speed. The data comes from the Chrome User Experience Report and will provide you with specific issues to help you optimize. The report also links to the PageSpeed Insights tool to give you even more optimization techniques. Google recommends using the report to track improvements after you have fixed a problem and see how they affect user experience. If you have feedback as this rolls out, add it to the user forum.

We should also note that so far it would appear that this report is experimental and doesn’t seem to be entirely accurate. Based on the small sample sizes Google provides in the report, many have pointed out some initial flaws. 

We have noticed that almost all of our clients have scores that are low. Hopefully that’s not a reflection of our skills! Others have noticed this as well. We have one client with an extremely light site that has a PageSpeed Insights score of 99. Even this client has only “moderate” scores in the new report. We’ve yet to see a site with fast scores.

Hopefully this improves over the coming weeks!


Google has begun sending out more performance notices

Some SEOs have recently begun to see some nifty performance notices sent their way via GSC. They basically indicate things like your top-growing pages, as well as your total clicks and impressions for the last month. Barry Schwartz notes that these aren’t really new (they first surfaced this past August), but the change is that Google looks to be sending more of them lately.


Google updates the JavaScript troubleshooting & JS SEO basics guides

Martin Splitt and Lizzi Harvey have added sections on best practices for web components to the JS troubleshooting guide and JS SEO basics guides. Did you know that rendering flattens shadow DOM content? Well it’s all explained in the updated guides! 


Fitbit has been acquired by Google

Google announced that they have entered a definitive agreement to acquire Fitbit. They want to move more in the direction of wearable technology, and say they will work with Fitbit’s team to bring the best of AI to wearables.


Beta for Google’s new Ads Lead Form Extension 

Google has announced a new beta for their ads lead form extension that is designed to help webmasters with securing data from users who click on their ads. This allows you to get in contact with the users and follow up based on their interest in your ad on mobile.


Highlights from the Google Webmaster Conference Mountain View: Product Summit

This is a massive post by Jackie Chu highlighting several talks from the recent webmaster conference. There was a lot discussed so be sure to read her full post. For now here are some points we pulled out for you: 

  • What is web deduplication? It is the process of identifying and grouping similar web pages and choosing a representative canonical URL to show users. 
  • If your site is temporarily down, use 5xx codes instead of 2xx as Google may interpret it as your page being empty. 
  • Businesses that have complete knowledge graphs are seen as twice as reputable by consumers.
  • Set the custom crawl rate in GSC if you need Googlebot to slow down. 
  • As robots.txt files are interpreted differently by each search engine, Google and Bing are looking to standardize the protocol. 
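
The 5xx tip above is easy to get wrong, since many “maintenance mode” setups quietly return 200. As a hedged sketch of the right behavior (our illustration, with a hypothetical one-hour retry window), here is a minimal maintenance server in Python:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Return 503 Service Unavailable for every request during planned
    downtime, so Googlebot retries later instead of treating the page
    as an empty (but successful) 200 response."""

    def do_GET(self):
        self.send_response(503)                  # temporary outage, not "OK"
        self.send_header("Retry-After", "3600")  # hint: come back in an hour
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance. Back soon.")

    def log_message(self, *args):
        pass  # keep the example quiet

# To run locally: HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

The key detail is the status code itself; the `Retry-After` value is just a polite hint.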

For more on the conference, you can also find some good stuff over on SER


Controversy over how Google parses robots.txt

At SearchLove London, Will Critchlow from Distilled recently presented an interesting paper he had written called “Google’s robots.txt parser is misbehaving”. In it, he describes some interesting differences between how Google actually parses robots.txt and what is in the open-sourced code available on GitHub. It gets complicated, so if you want to understand more about these differences, we recommend you read Will’s post.

Gary Illyes addressed this controversy in a few tweets:

We thought this was an odd response, especially considering the Distilled post specifically says, in bold text:

Googlers: if you’re reading this, please help us clarify for the industry how Googlebot really interprets robots.txt.
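
Will’s underlying point is easy to reproduce at home: different parsers genuinely disagree on the same file. As a hedged illustration (this uses Python’s stdlib parser, not Google’s open-sourced one, and a hypothetical robots.txt), note how rule precedence can flip a verdict:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt with overlapping rules -- exactly the kind
# of case where parser implementations can disagree.
rules = """\
User-agent: *
Disallow: /private/
Allow: /private/public-page
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every reasonable interpretation agrees this URL is blocked:
print(parser.can_fetch("Googlebot", "https://example.com/private/secret"))

# Google's open-sourced parser uses longest-match precedence, which
# would ALLOW /private/public-page. Python's stdlib checks rules in
# file order, so its verdict on this URL may differ. Testing with the
# parser you actually care about is the only reliable check.
print(parser.can_fetch("Googlebot", "https://example.com/private/public-page"))
```

The takeaway matches Will’s: don’t assume the published code, the documentation, and Googlebot’s live behavior all agree.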


Google SERP Changes

Some of the search operators appear to be buggy

Like a number of SEOs, Glen Allsopp has noticed that combining multiple search operators in a single query may cause them to cancel each other out and yield zero results. This issue has been going on for a number of days now, and although Google has acknowledged Glen’s tweets, there is no official word on what may be causing it.


It would seem as though the rich snippets image bug has returned

Webmasters have taken to Twitter to voice their displeasure with rich snippets not working as promised... yet again! The main issue is that images are not showing up in the results, and recipes are not being found in the carousel.

According to Lisa Bryan from Downshiftology, even though she has correct schema, her images are not showing in the search results.

[Image: recipe image not showing in search]

This was apparently temporarily resolved by requesting reindexing in search console. Her images reappeared within twenty minutes. And then they disappeared again.

Others are having a similar issue.

John Mueller replied to the Twitter thread asking for direct examples to pass on to the team. He also suggested that a Google Help forum thread be created to neatly organize everything. 

Note from MHC: We haven’t thoroughly investigated this situation. It certainly could be an issue on Google’s side. It is also important to note, however, that a site can lose rich snippets (including images) if for some reason Google has considered it lower quality. We’re not saying this is definitely what has happened here, and we do think that this is a bug, but it is something that should be considered if you have lost your rich snippets.


Google rolls out a new Knowledge Panel feature for universities

This is a really cool feature for universities. It will let them answer the top questions that are searched online as a knowledge panel. If you have a university page this is a feature you don’t want to miss out on! 


SEO Tips

What voice search says about the security of your site 

Not running an HTTPS (secure) site? Well, count yourself out of contention for a chance to appear in voice search. 

As John Mueller says, in general HTTPS is a lightweight ranking factor and a free certificate from somewhere like Let’s Encrypt works just as well as a paid one. Now there’s no reason to not go secure!


John Mueller and Martin Splitt discuss site speed and SEO

This is a special episode of Ask Google Webmasters as it has our two favourite guys! They discuss page speed and rankings this week. When looking at page speed, they categorize pages as really good or really bad, using both lab data and real field data from users who have visited those pages. 

A great piece of information came from this video. Martin Splitt said that there will never be a point where there is one magic page speed number they are satisfied with. You can’t break down speed into one factor as there are too many metrics involved. In other words, Google does not have an ideal page speed. Try to make it as fast as possible and don’t look for a perfect number as this will likely never happen. Martin says to look at what matters to you, your audience and your website. 


A quick lesson in self-referencing canonicals

Canonicals in general can be a tricky subject to understand! Thankfully Kristina Azarenko gives a superb explanation of self-referencing canonicals using a simple metaphor.

If you want to brush up on canonicals and get a better understanding of this tricky topic, we broke it down in simple terms over on our Wix SEO site. 
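
As a practical aside (our sketch, not from Kristina’s post), checking whether a page’s canonical is self-referencing only takes a few lines of stdlib Python. The helper names below are our own invention:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

def is_self_referencing(html, page_url):
    """True if the page declares a canonical that points back at itself.
    Trailing slashes are normalized for the comparison."""
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.canonical is None:
        return False
    return finder.canonical.rstrip("/") == page_url.rstrip("/")
```

A crawl that runs this check across your site is a quick way to spot pages whose canonical points somewhere you didn’t intend.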


Your site speed matters

This is a cool case of how optimizing for page speed can be beneficial. What happened? They moved over from a slow host to a fast one and had a large increase in conversions! 


Context matters to Google!

This was an interesting tweet from Danny Sullivan which outlined the difficulties Google faces when it comes to understanding synonyms. If you want Google to understand the right synonym, be sure to give them plenty of signals to understand it rather easily!


Lily Ray discusses E-A-T strategy

If you’re looking for a dose of Halloween and E-A-T this podcast episode is for you! An E-A-T strategy is not one with quick returns, so listen to Lily give her expertise on how to make it one that will be beneficial in the long run. 


What impact does traffic have on the worth of a backlink?

Ultimately, Google says that traffic does not have an impact on the worth of a backlink. If your traffic diminishes, your backlink is not devalued and does not need to be disavowed. 

This question is another good reminder on why we should not blindly choose metrics to use to decide whether a link is unnatural. A link is unnatural if it is one that was built with the intention of manipulating PageRank. Google has a good list of examples here.


Google can override the key info you provide to the public if they think it’s in the users’ best interest 

If Google senses that customers are having a hard time reaching a business via their publicly displayed telephone number, Google might take the initiative to showcase a new number that they know customers can get a response from. Go figure!


If you are overloading a page with too much JavaScript...

Jen went on to clarify, “If you are overloading the page with lots of JavaScript, Googlebot will eventually interrupt the script and stop. They also specifically mention cryptocurrency miners.”


How Bing uses Quality Raters

We liked this tweet by Frédéric Dubut of Bing:


It sounds like Bing will be handing out more penalties soon

In the seven years that we have been helping site owners with manual actions, we have seen hundreds of Google manual actions, but only a small handful from Bing. The interesting thing is that in our experience, if Bing gives you a manual action you can often converse back and forth with a member from the Bing webspam team. They are generally quite willing to help.

Bing announced this week that they will be updating their webmaster guidelines to address “inorganic site structures”. They hope to tackle issues such as doorway pages and also subdomain leasing.

Bing’s article is quite interesting as it talks about how it is often difficult for an algorithm to determine whether a subdomain is actually a part of the root domain’s business model, or a completely different site. For example, yoursite.com and yoursite.co.uk should likely be treated as one website. But, site1.wordpress.com and site2.wordpress.com should be separate ones. Bing mentions that in almost every case where their algorithm got it wrong, “the most common root cause was that the website owner actively tried to misrepresent the website boundary.”

The post gives three examples of inorganic site structures:

  • Private Blog Networks (and other link networks)
  • Doorways and duplicate content
  • Subdomain or subfolder leasing

Ads and interruptions can destroy content experience 

Dejan Marketing surveyed 1500 Australians and asked them to describe what frustrates them when it comes to reading content online. Well, no surprise here, 40% said ads were their biggest frustration and second place went to interruptions (at 25%).

Dan and his team include a handful of written responses in this article on things like ads in the middle of the content, autoplay sounds and videos, etc. One really interesting thing, though, was that some of the written responses touched on items that factor into quality and E-A-T. Some of the other frustrations include:

  • The legitimacy and credibility of the content
  • The trust level of the author or publisher
  • The lack of citations
  • The obvious bias within the article
  • Clickbaity headlines or content

This is a light read but definitely an article to check out!


How to enable targetText on Chrome

targetText is Chrome’s experimental scroll-to-text feature, which lets a URL fragment like #targetText=some%20phrase scroll to and highlight that phrase on the page. Hats off to Lily Ray for this one!


Switching to mobile-first indexing should have little to no effect on rankings

When asked what Google defines as ‘ready’ when it comes to deciding when to move a site to mobile-first indexing, John said that it comes down to the overall site and a number of signals are checked. Google does not wait until a site is perfect (otherwise they’d be waiting until the end of time). Instead they take the initiative to move a site with the intention of it not losing any ranking.

John notes in his tweet that some sites ‘see a rise in search when switched over’ but there’s no general shift. You may in fact see a rise in rankings when switched to MFI, but losing rankings because of it really shouldn’t be happening. 


Eric Enge & Barry Schwartz discuss SEO testing, structuring content, and managing client expectations

This is part two of Barry and Eric’s chat, focusing more on SEO-related topics as well as discussing Google. Eric gives a great reminder that what works for some clients doesn’t work for all, and this can contradict what Google puts out there. We loved the part where he says that SEOs have a tendency to overcomplicate things, as we often do ourselves. For example, when G announces a major update, don’t focus on the algo but on its purpose. Publish great content, promote it and focus on the end game!


Google does not intend to add a Google Discover referrer but there is an alternative

Asked whether SEOs could see a Discover-specific referrer, John said that as far as he knows, there are “no plans to break that out further.” In the meantime, though, you can take advantage of the Google Discover performance reports in GSC.


Workaround to get the URL to show in Chrome

If you just aren’t excited about Google’s plan to hide the URL in Chrome, Brodie Clark and Valentin Pletzer have a trio of options, as seen below.

Also if the breadcrumbs in the SERPs aren’t doing it for you, try this Chrome extension.


Updating your PHP could mean a drastic speed increase

This is a case of upgrading client sites to PHP 7.3, resulting in speed increases of up to 200%. Worth considering! 


Google Help Hangout Tips

John Mueller discusses whether link velocity could hurt you 

In a recent Google Webmaster Help Hangout, John says that getting too many links in a short period of time is not in itself an issue; rather, it is the type of links that is in question. The question was:

“If I started link building and built 200 links in two days and before that, zero links in two years, would Google penalize the site?”

John says if you’re building links that don’t align with the Webmaster Guidelines, Google’s algorithms may ignore them or “take action” on the site. Or the manual actions team might apply a manual action. So it’s not the speed at which you acquire the links, but rather how you acquired them.

Read John’s full answer below: 

“If you are building 200 links in two days, it is not so much the number of links you are building in a short period of time, it is really the matter that you are probably building links in a way that would not align with our webmaster guidelines. Our algorithms might take action on that or they might ignore those links. On the other hand, our manual actions team might take action on that and apply a manual action.

So it is not so much the number of links and the time you are talking about. It is more the type of links you are building and the type of link building you are doing in general. If you are dropping links on random sites, if you are buying links, if you are exchanging links using some weird link network. All of those things would be against our guidelines, which would perhaps result in 200 links being built in two days. And since those are against our webmaster guidelines, we try to catch them algorithmically and ignore them. And we do from time to time try to look at these things from a manual point of view and we may take manual action.

So really, it is not the number in the period of time, it is really the type of activity instead.”


What to do if your product page is outranking your category page

Make sure your category page is internally linked well within your site as this can help with its ranking. Additionally, make sure you’re not keyword stuffing when linking, as G will see this as a less reputable page and be more careful when showing it. 

To take it one step further, add a visual element like a banner to the page that will help users distinguish between the two types of pages, and guide them to the right one smoothly. 


Can spammy links hurt a site algorithmically? 

If Google looks at a site and sees a lot of spammy backlinks, it will assume there is malicious intent. It can then be demoted and lose visible rankings. So yes, spammy links can definitely hurt a site. Be sure to read our article on disavowing for more guidance on this.

Marie went further and clarified that a spammy link is one intended to manipulate PageRank. Here is G’s list of these. 
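
If, after manual review, you do decide to disavow, the file format itself is simple: `#` lines are comments, `domain:` lines disavow an entire domain, and bare URLs disavow a single page. A minimal sketch (hypothetical helper name and example domains, ours):

```python
def build_disavow_file(spam_domains, spam_urls=()):
    """Format a disavow list the way Google's Disavow Tool expects:
    comment lines start with '#', 'domain:' lines cover a whole
    domain, and bare URLs cover individual pages."""
    lines = ["# Unnatural links identified after manual review"]
    lines += [f"domain:{d}" for d in sorted(spam_domains)]
    lines += sorted(spam_urls)
    return "\n".join(lines) + "\n"

print(build_disavow_file(
    {"spammy-links.example"},
    ["https://bad.example/paid-post"],
))
```

The hard part, of course, is the review that decides what goes in the list, not the file itself.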


What constitutes a link scheme?

John describes a link scheme as excessive linking between a large number of sites. Specifically, when a group of sites works together to cross-link to each other, they are likely involved in a link exchange scheme. 


Your GSC numbers could be off 

When your data is fresh, you could see a different number than once it has settled down. This is because GSC now includes fresh data in its reports. 


Other Interesting News

Google has a new initiative aimed at a health record search tool for doctors

The new head of Google’s Health Initiative, David Feinberg, recently announced they are building a tool that will make it easier for doctors to search medical records. The hope is to have a search function on electronic health records that allows medical professionals to easily find very specific and trustworthy medical information. It’s not clear how much of this tool is built, but like many of the other tech giants, Google plans to continue to carve out a piece of the pie where technology and healthcare overlap. 


Google says they intend to close the gap between the time it takes to index Google Shopping content and other content

When asked why Shopping content gets indexed faster than other content, the answer was that this gap will be closed. 


Making Wikipedia more reliable 

Internet Archive has created a bot that will crawl Wikipedia’s sources and verify them. You can click a link and see a two-page preview as they have scanned over 50,000 books. They’re trying to give more context surrounding these sources. The Wayback machine has archived 387 billion pages since 2001, and scanned 3.8 million books. They have yet to scan all of the books cited by Wikipedia, but scan over 1000 a day and are making their way there. 


Firefox 72 to hide API prompts

This goes back to Firefox’s announcement in April to reduce the permission prompts users get. They have run experiments to test this, and with Firefox 72 it should change. This is good news, since API prompts really shouldn’t be happening without user interaction: “72 will require user interaction to show notification permission prompts.” Prompts triggered without interaction will only be shown as a small icon in the URL bar.


Yext's new tool ‘Yext Answers’ soon to help brands with queries on their sites 

Announced at the Onward conference, this tool is an internal search engine that will help your brand answer queries users are asking. It will give sites insight into what people are searching for on their site, so they can better optimize to answer those questions. Yext has spent two years developing this tool with NLP technology to understand queries and provide answers in various formats depending on the question. 


Local SEO - Google SERP Changes

Google now displaying your short names

One of the more anticipated features in local this past year (ie. short names) has been live for several months now, but site owners can now see their short name appear on their GMB profile (though Joy Hawkins notes that this appears to be currently restricted to mobile browsers, not including the Maps app).


Local SEO - Tips

More short names available for local businesses

If you haven’t had luck securing your desired short name, you may be in luck! Over on the Local Search Forum it was mentioned that some previously unavailable names may be up for grabs. 


SEO Tools

Handy Chrome SEO bookmarks

Some really cool bookmarks in here from Glen Allsopp! They let you see the backlinks to your current tab (pulling from all of your link-finding tools), extract 100 URLs from the SERPs, check for duplicate content or plagiarism, get PageSpeed Insights for your current URL, and so much more! 

There is a lot of detail in here, so in short, if you want to learn how to make these, I recommend reading this article.


Create your AMP stories for galleries

This one falls more in line with a process rather than a tool, but it’s a quick guide with just a handful of steps!


Recommended Reading

This Is What Happens When You Accidentally De-Index Your Site from Google – Jeff Baker
https://moz.com/blog/accidentally-deindex-your-site
Nov 5, 2019 

This is an interesting case of an accidental de-indexing and what it meant for his site. Jeff includes his thought process when he suddenly lost all of his keyword rankings and what he did about it. 

He started by deleting the old sitemap, building a new one, uploading it to Search Console, and requesting reindexing. What were the results? In one week he had a 33% drop in search presence. This number decreased with time, but they lost conversions throughout the process. It took six weeks until they fully recovered. 

Even once pages were re-indexed, search visibility was not immediately restored. The hardest part was getting pages re-indexed; some took eight to nine weeks. He definitely does not recommend trying this, even as an experiment. 
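
Rebuilding a sitemap, as Jeff did, is easy to script. A minimal sketch using Python’s stdlib (our illustration, not Jeff’s actual process; the URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs,
    following the sitemaps.org protocol (a <urlset> of <url>/<loc>)."""
    urlset = ET.Element(
        "urlset",
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9",
    )
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

A production sitemap would typically also include `<lastmod>` entries and respect the 50,000-URL-per-file limit, splitting into a sitemap index when needed.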

 

How Might Google Extract Entity Relationship Information from Q&A Pages? – Bill Slawski
https://gofishdigital.com/entity-relationship-information/
Oct 29, 2019

This is another great patent-focused article from Bill over at Go Fish Digital. This time, Bill is digging into a patent granted earlier this year covering how websites that provide a question-and-answer format could be helping to feed Google’s entity recognition and understanding. 

The basic premise is that a Q&A page will likely include information about an entity and its relationship to another (included in the answer). For example, if someone asked “Who is Barack Obama’s wife?” on Quora, the answer displayed as “Michelle Obama” tells Google about those entities’ relationship to each other. The patent details how Google can pick up on these and, most importantly, determine which answer on the Q&A page is most likely to contain the correct entity relationship information. After all, people don’t always answer correctly on these sites! 

 

The Featured Snippets Cheat Sheet and Interactive Q&A – Britney Muller
https://moz.com/blog/featured-snippets-qa
Oct 25, 2019

Britney Muller recently hosted a webinar on featured snippets. The broadcast didn’t go as smoothly as usual, but Moz has since posted a detailed Q&A round-up of the most common questions. If you have questions about featured snippets, you’ll likely find your answer here. Also included is a helpful Featured Snippet Cheat Sheet. Here are a few questions we found particularly interesting: 

  • What are best practices around reviewing the structure of content that's won a snippet, and how do I know whether it's worth replicating? - Ask yourself: does it succinctly answer the query? Does it sound good as a voice answer? Does the page provide additional answers or information around the topic? The goal is to ask questions and determine why Google is finding this particular page more deserving than others. 
  • Can I win a featured snippet with a brand-new website? Generally, if you’re ranking on page 1, you have a chance. 
  • How should content creators consider featured snippets when crafting written content? Consider the searcher’s intent: are you providing the information in the format the user prefers (i.e. video, image, text, etc.)? Then review all the other results on page one: are there similarities? 

There are a number of great resources within this post, and if you still have questions about featured snippets, consider asking in the comments section. The Moz community has always been a great place to learn from others’ expertise and they are more than happy to help. 

 

How we optimized our Crawl Budget by removing 72% of Skroutz indexed URLs while growing to 30 million sessions – Vasilis Giannakouris
https://engineering.skroutz.gr/blog/SEO-Crawl-Budget-Optimization-2019/
Oct 30, 2019

This is quite the #longread featuring a successful case study on crawl budget optimization for a massive e-commerce site that led to improved search performance. Crawl monitoring was vital to the team of SEOs finding the biggest sources of inefficiency, significantly reducing the bloat in the index, and ultimately improving the time-to-indexation for new URLs. One of their key takeaways: noindex does not mean Google stops crawling. Also, be careful when consolidating pages, as it can be a long process to recover rankings.

 

Effective SEO Implementation for Software Businesses – Viola Eva
https://www.searchenginejournal.com/seo-software-businesses/331019/
Oct 29, 2019

Do you own or work for a software company? In this amazing article, Viola Eva goes over how you can structure your site and content in a way that works well for SEO and for Google. Viola goes into a technique of ‘siloing’ your website to create topical relevance that will improve UX and help you compete in your niche. Some of the great advice provided by Viola includes optimizing your homepage, creating a main navigation that supports your SEO efforts, and creating and optimizing content for your blog. If you want to learn more about site structure and how to better optimize your site for SEO, this is definitely a recommended read! 


Recommended Reading (Local SEO)

Spooky Review Strategies & Halloween Reputation Management Horror Stories – Garrett Sussman, Greg Gifford, and Claire Carlile
https://blog.grade.us/halloween-reputation-management/
Oct 31, 2019

Reviews are a major component in users’ decision making. For local SEO, reviews are a major factor in your GMB position. This article dives into the importance of reviews and strategies to help you improve in this tricky area. 

Greg Gifford recommends creating a simplified site for reviews. Moving these into a landing page removes friction and funnels users. Carlile and Gifford recommend an “enhanced ask” technique that provides topics for users to write about. They provide some great horror stories but don’t worry, they come with recommendations. If you’re struggling with getting reviews, this is a must read article.

 

Tales of Horror from the Dark Side of Google My Business – Jason Brown
https://www.brightlocal.com/blog/tales-of-horror-from-the-dark-side-of-google-my-business/
Oct 31, 2019

Strap yourself in for this spooky tale of GMB horror stories. Jason Brown acts as the narrator and features a number of experts on topics such as unfavourable name changes, fear-inducing photos, rotten reviews, and bizarre Q&A answers. This is a rather light read but serves as a good reminder of what @DamonGochneaur has said: user-generated content can be a nasty merry-go-round if not carefully monitored.

 

#GTMTips: Track Outbound Link Clicks In Google Tag Manager – Simo Ahava
https://www.simoahava.com/gtm-tips/track-outbound-links-google-tag-manager/
Oct 30, 2019

This is a great tip for using Google Tag Manager to track outbound link clicks from your website. It’s not something many of us think to track, but there can be insights to be gleaned here. The setup within GTM is not overly complicated if you follow the steps outlined by Simo for creating the auto-event variable with an “outbound” component type.


Jobs


Who writes this newsletter?

When I, Marie, started this newsletter in 2012, it was a way for me to email each of you if Google updated the Panda or Penguin algorithms. As my business grew, the newsletter became an excellent training resource for my new staff. Now, our entire team at MHC is involved in writing newsletter content. 

I am so thrilled to see how the newsletter has grown. Thank you, readers, for all of your wonderful feedback!