On February 8, 2019, Gary Illyes did a Reddit AMA where he took questions from the SEO community. We hand-picked the questions and answers most relevant to SEOs and summarized them below, covering Mobile First Indexing, E-A-T, and a range of other SEO topics.
Does use of hreflang give you a ranking benefit?
This is an interesting question and I think the confusion is more about the internal vs external perception of what's a "ranking benefit".
You will NOT receive a ranking benefit per se, at least not in the internal sense of the term. What you will receive is more targeted traffic. Let me give you an example:
Query: "AmPath" (so we don't use a real company name ::eyeroll::)
User country and location: es-ES
Your site has page A for that term in EN and page B in ES, with hreflang link between them.
In this case, at least when I (re)implemented hreflang, what would happen is that when we see the query, we'd retrieve A because, let's say, it has stronger signals, but we see that it has a sibling page B in ES that would be better for that user, so we do a second pass retrieval and present the user page B instead of A, at the location (rank?) of A.
Summary: There is no ranking benefit for using hreflang, BUT, it can help you get more targeted traffic to the appropriate pages.
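For reference, the setup Gary describes is implemented with hreflang annotations in the `<head>` of each page in the cluster (the URLs below are hypothetical; every page must list itself and each sibling reciprocally, or the annotations are ignored):

```html
<!-- On BOTH https://example.com/en/ampath/ and https://example.com/es/ampath/ -->
<link rel="alternate" hreflang="en" href="https://example.com/en/ampath/">
<link rel="alternate" hreflang="es" href="https://example.com/es/ampath/">
<!-- Optional fallback for users whose language has no dedicated page -->
<link rel="alternate" hreflang="x-default" href="https://example.com/en/ampath/">
```

The same cluster can alternatively be declared via HTTP `Link` headers or an XML sitemap; the mechanism doesn't matter as long as the annotations are reciprocal.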
How does Google use unlinked mentions?
Summary: Unlinked mentions are used for entity determination. Our note: This is likely connected to E-A-T. If you are consistently mentioned in places that are known as authoritative in your niche, then you are likely seen as having E-A-T in that subject.
Does Google use user interaction such as clicks, etc. as a ranking factor?
I don't think we ever said a straight "no", here's why:
Autocrat asks me on reddit if we use unicorns in ranking. Dumbass Gary says "no", because he thinks unicorns don't exist. An hour later Paul Haahr pings Gary saying that this obscure team in Atlantis is actually using unicorns. Now Gary has to go back to Autocrat and clarify his "no", but by that point Barry already wrote a clickbaity article and everything is melting externally.
So I can't NOT skirt, skip, duck and dive, I need to give you something usable that also doesn't cause our world to melt. And that answer is that we primarily use these things in evaluations.
Asked to expand on how Google uses these signals in evaluations, Gary continued:
Sure. When we want to launch a new algorithm or an update to the "core" one, we need to test it. Same goes for UX features, like changing the color of the green links. For the former, we have two ways to test:
With raters, which is detailed painfully in the raters guidelines
With live experiments.
1 was already chewed to bone and it's not relevant here anyway. 2 is when we take a subset of users and force the experiment, ranking and/or ux, on them. Let's say 1% of users get the update or launch candidate, the rest gets the currently deployed one (base). We run the experiment for some time, sometimes weeks, and then we compare some metrics between the experiment and the base. One of the metrics is how clicks on results differ between the two.
Summary: Yes, Google does use user interactions as a signal. However, Gary says they are mostly used to test how effective an algorithm change is. Google will make a change, roll it out to a small percentage of users, and then measure how well those users engage with the new results to determine whether the change was effective.
Can you be penalized for over-optimizing internal links?
No, you can abuse your internal links as much as you want AFAIK.
Summary: No, there is no penalty for internal link over-optimization.
Our note: Some SEOs on Twitter have said that they received a manual action for aggressive internal linking in the past. I (Marie) have yet to see this, but it's certainly possible. That said, in 2013 Matt Cutts said that you would have to be seriously overdoing it in an incredibly spammy way for Google to take action on over-optimized internal links.
What areas are most SEOs overlooking these days?
Google Images and Video search are often overlooked, but they have massive potential.
I'm generally happy with the questions unless it's some conspiracy bullcrap, or Expert XYZ said this and tested it with one page so it must be true. Those are the major reason I don't frequent Twitter anymore.
Summary: We should be paying more attention to use of images and video.
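A concrete, low-effort starting point for image search is making sure image markup is crawlable and descriptive. A minimal sketch (the filename and dimensions are made up):

```html
<!-- Descriptive filename and alt text help image search understand
     what the image shows; explicit dimensions also prevent layout shift -->
<img src="/images/red-trail-running-shoe.jpg"
     alt="Side view of a red trail-running shoe"
     width="800" height="600">
```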
How does Google determine relevance?
We have this massive doc that touches on those things (the Quality Raters' Guidelines).
Basically we have an idea about determining relevance, we ask raters to evaluate the idea, raters say it looks good, we deploy it. Relevance is a hairy topic though, we could chat about it for days and still not have a final answer. It's a dynamic thing that's evolving a lot.
Summary: Gary referenced the Quality Raters’ Guidelines to answer this. While he didn’t tell us how Google determines relevance, he said that when they make changes to make results more relevant, they get quality raters to evaluate the results.
Our note: We believe that March 9, 2018 was a major update in which Google got much better at determining relevance. Many sites that saw drops stopped ranking for queries that they never should have ranked for in the first place.
Is E-A-T connected to links?
I guess that's a little oversimplified, but yeah.
Summary: Yes, E-A-T is connected to links. If experts in your field are linking to you, then you are more likely to be seen as an expert in that field as well.
Is it ok to use machine written content?
If you can generate content that's indistinguishable from that of a human's, go for it. I'm actually planning to write something up on how to use ML and NLP for SEO (when I have some time).
Summary: It is ok to use machines to write content provided that it really looks and sounds like a human wrote it.
Is Web accessibility a ranking factor for images and videos?
1. Unfortunately, no.
2. I don't remember where the data is coming from, but not our publicly available tools, of which we have many for reasons that are beyond me.
3. We have a max size limit in Googlebot; it's relatively large, but I don't remember the exact number. Other than that, we don't have anything else that I know of. Googlebot doesn't randomly abandon a connection if it already started receiving bytes, though, unless the connection is reset or something. And we definitely index whatever we got from Googlebot.
Summary: Web accessibility is not currently a ranking factor.
Does RankBrain look at user engagement signals?
I'll answer this quickly because I'm waiting for a plane and I'm bored (I'm supposed to answer questions tomorrow).
RankBrain is a PR-sexy machine learning ranking component that uses historical search data to predict what would a user most likely click on for a previously unseen query. It is a really cool piece of engineering that saved our butts countless times whenever traditional algos were like, e.g. "oh look a "not" in the query string! let's ignore the hell out of it!", but it's generally just relying on (sometimes) months old data about what happened on the results page itself, not on the landing page. Dwell time, CTR, whatever Fishkin's new theory is, those are generally made up crap. Search is much more simple than people think.
Summary: RankBrain uses machine learning on historical search data to predict which result users would most likely click on for a previously unseen query.
Is there a way to get a non-US Googlebot crawler to visit your pages? (This would be the case for a site that redirects all US users to a particular page.)
I would be extremely surprised if anyone would bat an eye for you doing that. These corner cases are always interesting but ultimately you need to hack your way through our limitations (e.g. non-US googlebot is as good as nothing for reasons).
Yes, it's very rare except in Korea and China
Summary: You can exclude Googlebot (which generally crawls from US IP addresses) from being redirected. This should not cause a cloaking issue.
Is there a way that we can tell whether Google is passing signals in a redirect? For example, if you redirect pages to the homepage, is there a way that we can tell whether Google is passing signals to the homepage?
Aaaaaah!!! The source pages become 404 as they should. That makes sense, no?
Edit: as for signals, unless we can figure out they're really 404, which is not easy, the signals will just pass on and you wouldn't get any report. If it does become a soft 404, then you get a report and no signals are passed.
Unless we stop visiting the page altogether, which if it's linked well likely won't happen (e.g. it's scheduled because it's linked), we will follow the links
Summary: (This is a good one!) If Google decides not to pass signals via a redirect like this, you should see soft 404 errors in GSC.
If you noindex a page, is it true that the links on that page will eventually be treated as nofollowed links?
I don't understand the question.
If the page is noindex, it's not "404 eventually", it's just not indexed. If you add "follow" directive to the page, Googlebot will discover and follow the links. That's it, I think
Again, paging /u/johnmu because I'm not aware of "noindex follow" becoming "noindex nofollow" artificially on our side. If it's not linked, then the page won't be scheduled and will be abandoned after a while, but if it's linked, that's unlikely to happen.
Summary: This was a tricky answer as John Mueller has said in the past that the links on a noindexed page will eventually become nofollowed. Gary says that if Google is still crawling the page, then the links will be followed.
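For context, the directive being discussed is the robots meta tag. A page meant to stay out of the index while its links remain discoverable would carry something like:

```html
<!-- Page is excluded from the index; links can still be discovered
     and followed for as long as Googlebot keeps crawling the page -->
<meta name="robots" content="noindex">
```

Note that `follow` is the default, so `noindex` alone is equivalent to `noindex, follow`; adding `nofollow` would explicitly stop link discovery from the page.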
If you are removing a large number of pages from your site, should you do it all in one go? Or in batches?
Removing pages will have ranking drop effects in most of the cases. That's because:
1. those pages might have gotten traffic that you're gonna lose.
2. those pages might have linked to other pages, so losing them will orphan or remove some PR from those target pages.
Batching sounds like a good idea; you can do damage control much easier.
Summary: Gary cautions that removing pages will also remove internal links, so be careful. If you are going to remove a large number of pages, do it in batches. If you see no traffic losses after a while, proceed with the next batch of removals.
Is there a sandbox for new sites?
Yeah, there are anti-spam mechanisms that can do that to your site, but you really have to look spammy for them to trigger.
Also, no, I'm not going to talk about them for obvious reasons.
Summary: Yes, there is a sandbox that could prevent new sites from ranking well, but only for very spammy sites.
Is there a stigma that follows a site after a manual action so that it will never rank again?
Summary: No, there is no lasting penalty. Our note: In most cases, if a site gets a manual action, there are also algorithms at play that will continue to judge these issues unless they are completely cleaned up.
If you have two links from one domain, do both of them pass PageRank?
1. yes, they're both counted.
2. i really wish SEOs went back to the basics (i.e. MAKE THAT DAMN SITE CRAWLABLE) instead of focusing on silly updates and made up terms by the rank trackers, and that they talked more with the developers of the website once done with the first part of this sentence.
Summary: If you have two links from one domain, both of them are counted.
If your site is slow on mobile, once moved to Mobile First Indexing, are you likely to see ranking drops?
Probably none. We already saw your pages are slow as a rheumatic snail before moving it to MFI, so you'll likely keep the old rankings.
Summary: You shouldn’t see ranking drops after the move to MFI, as Google has already been able to assess that your mobile site is slow. In other words, if there was a ranking demotion, it likely already happened.
If you like stuff like this, you'll love our newsletter!
My team and I report every week on the latest Google algorithm updates, news, and SEO tips.