It's another Webmaster Help Hangout with our main man, John Mueller. This week he had some great insights on links, page speed, UTM parameters, and affiliate links. As always, the full video and transcript can be found below. Also, if you find these recaps helpful, you should check out our newsletter! We collect and curate the best SEO articles of the week and sum them up for you in quick, digestible summaries.

If you previously had a manual action for unnatural links, will this stigmatize the site forever?

5:20

John Mueller Help Hangout February 19 2019

It's hard to say sometimes, but in general, if a manual action is resolved then that manual action isn't affecting the site anymore. So there's no kind of hold-back effect that would keep a website held back, like saying, well, they did some things wrong once, therefore we'll prevent them from showing up high in search; there's nothing like that. But in particular when it comes to links, if you had a lot of unnatural links, and perhaps your site was ranking because of those unnatural links, then by cleaning those unnatural links up there is of course a new situation that our algorithms have to kind of adjust to first. So that's something that could to some extent have an effect there. The way I would approach this is just to continue working on the website how you normally would, and make sure you don't run into that situation again where the web spam team runs across issues with regards to your links. So don't go off and say, oh, I removed those links, therefore I'll go buy some links from another site instead; make sure that what you're doing is really for the long-term use of your website.


Summary: Once a manual action is removed, it is fully removed. However, unnatural links still have the ability to hurt a site algorithmically. This can take a long time to clear up.


 

If your site loads quickly for users, but Google’s PageSpeed Insights tool calls it slow, should action be taken?

11:29

John Mueller Help Hangout February 19 2019

So a lot of these metrics that you're looking at from Lighthouse are primarily presented to you with regards to kind of the user-facing side of things. From our point of view, from a search point of view, we take a variety of these metrics together to figure out how we should kind of be seeing the site with regards to speed, but the kind of absolute measurements that you see there in Lighthouse are things that most likely have a stronger effect on how users see your site. So instead of asking Google, with regards to SEO, is this site too slow, I would look at it with users and kind of get their feedback instead, and if you're saying that your site is actually pretty fast for users, that it provides the content fairly quickly, then probably you're in a good state there.


Summary: What is most important is whether your pages load quickly for actual users.

Our note: We don't know exactly how Google algorithmically determines which pages to demote for slow load times on mobile. Our current thinking is that we would like to see page speed improvements for sites with poor "first contentful paint" scores in Lighthouse, as this number approximates how long it takes for readable content to appear.
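For anyone who wants field data rather than lab scores, first contentful paint can be measured directly from real users with the standard PerformanceObserver browser API. Below is a minimal sketch; the /fcp-metrics endpoint is a hypothetical collection URL, not something Lighthouse or Google provides.

```typescript
// Minimal sketch: report First Contentful Paint (FCP) from real users.
// The /fcp-metrics endpoint is a placeholder for your own analytics collector.
const fcpObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      // entry.startTime is the FCP value in milliseconds.
      navigator.sendBeacon('/fcp-metrics', JSON.stringify({ fcp: entry.startTime }));
      fcpObserver.disconnect();
    }
  }
});

// buffered: true also reports a paint that happened before the observer was registered.
fcpObserver.observe({ type: 'paint', buffered: true });
```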


 

If a page was ranking for certain keywords, and you redirect it to a new page, will that page be able to rank for those keywords?

16:25

John Mueller Help Hangout February 19 2019

So essentially, if we can really transfer everything one to one to the new domain, then that new domain will be in place of the older domain. It can rank for the old keywords and it can rank for new keywords. Again, if you make significant changes during that move, then of course that new state has to be taken into account, and it's not just that we can forward everything to the new one, because the new one isn't just a moved version of the old one.


Summary: When Google sees links pass through a redirect, they try to determine whether to pass link signals along. If the new page is similar, then there is a good chance that the link signals pointing to the previous page will pass along as well.
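If you want to spot-check a migration yourself, a small script can confirm that each old URL returns a single permanent redirect pointing at the expected new page. This is just a sketch, assuming Node 18+ for the built-in fetch; the hostnames are placeholders.

```typescript
// Sketch: verify an old URL 301-redirects directly to its new equivalent.
// Requires Node 18+ (built-in fetch); hostnames below are placeholders.
async function checkRedirect(oldUrl: string, expectedUrl: string): Promise<void> {
  const res = await fetch(oldUrl, { redirect: 'manual' }); // don't follow the redirect
  const location = res.headers.get('location');

  if (res.status !== 301) {
    console.warn(`${oldUrl}: expected 301, got ${res.status}`);
  }
  if (location !== expectedUrl) {
    console.warn(`${oldUrl}: Location is ${location}, expected ${expectedUrl}`);
  }
}

checkRedirect('https://old-domain.example/blue-widgets', 'https://www.new-domain.example/blue-widgets');
```

Redirects that land on noticeably different content are the case John flags as needing re-evaluation, so it is worth catching those early.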


 

Is it ok to use UTM parameters on internal links?

17:33

John Mueller Help Hangout February 19 2019

I guess that's always a bit of a tricky situation, because you're giving us mixed signals essentially. On the one hand you're saying these are the links I want to have indexed, because that's how you link internally within your website. On the other hand, those pages, when we open them, have a rel canonical pointing to a different URL. So you're saying index this one, and from that one you're saying, well, actually index a different one. So what our systems end up doing there is they try to weigh the different types of URLs that we find for this content. We can probably recognize that these URLs lead to the same content, so we can kind of put them in the same group, and then it's a matter of picking which one to actually use for indexing. On the one hand we have the internal links pointing to the UTM versions; on the other hand we have the rel canonical pointing to kind of the cleaner version. The cleaner version is probably also a shorter, nicer-looking URL, which kind of plays in line with that as well, but it's still not guaranteed from our point of view that we would always use the shorter one.

So rel canonical is obviously a strong signal; internal linking is also kind of a strong signal, in that it's something that's under your control. So if you explicitly link to those URLs, we think maybe you want them indexed like that. In practice, what would probably happen here is we would index a mix of URLs. Some of them we would index as the shorter version, because maybe we find other signals pointing at the shorter version as well; some of them we would probably index with the UTM version, and we would try to rank them normally as the UTM version.

In practice, in search you wouldn't see any difference in ranking; you would just see that these URLs might be shown in the search results. So they would rank exactly the same with UTM or without UTM, and they would just be listed individually in the search results. From a practical point of view, that just means that in Search Console you might see a mix of these URLs. In the performance report you might see this mix; in the indexing report you might see a mix; and in some of the other reports, maybe around AMP or structured data if you use anything like that, you might also see this mix. You might also see, in some cases, a situation where it swaps between the URLs. So it might be that we index it with UTM parameters at one point, and then a couple weeks later we switch to the cleaner version and say, well, probably this cleaner version is better, and then at some point later on our algorithms look at it again and say, well, actually more signals point to the UTM version, we'll switch back. That could theoretically happen as well.

So what I would recommend doing there is, if you have a preference with regards to your URLs, make sure that you're being as clear as possible within your website about which version you want to have indexed. With UTM parameters you're also creating a situation where we'd have to crawl both of those versions, so it's a little bit more overhead. If it's just one extra version, that's probably not such a big deal. If you have multiple UTM parameters that you're using across the website, then we would try to crawl all of the different variations, which would in turn mean that maybe we crawl, I don't know, a couple times as many URLs as your website actually has to be able to keep up with indexing. So that's probably something you'd want to avoid. My recommendation would be to really try to clean that up as much as possible, so that we can stick to the clean URLs, to the URLs that you want to have indexed, instead of ending up in this state where maybe we'll pick them up one way, maybe we'll pick them up another way, and in your reporting it could be either, and you have to watch out for that all the time. So keep it as simple as possible.


Summary: Be careful. Internal links show Google which pages are important on your site. If you link to pages with UTM parameters, you may end up having Google crawl multiple URLs with the same content. This is not a problem for a couple of pages, but it can affect Google's ability to crawl the site if done on a large scale.

Make it very clear which is the canonical version so that all of the signals pass to that page. The following will help (a minimal sketch of the first point follows the list):

 

  • Use a canonical tag on all versions of the page that points to the non-UTM URL
  • Make sure that the content on each version of the page is the same
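As a minimal sketch of the first point, internal links can be normalized so they always point at the clean URL, while every variant carries the same canonical; the parameter list and URLs below are illustrative.

```typescript
// Sketch: point internal links at the clean URL by stripping UTM parameters.
// The parameter list and example URL are illustrative.
const TRACKING_PARAMS = ['utm_source', 'utm_medium', 'utm_campaign', 'utm_term', 'utm_content'];

function cleanInternalUrl(href: string): string {
  const url = new URL(href);
  for (const param of TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

// Every variant should then carry the same canonical, e.g.:
// <link rel="canonical" href="https://www.example.com/widgets/">
console.log(cleanInternalUrl('https://www.example.com/widgets/?utm_source=sidebar&utm_medium=internal'));
// -> https://www.example.com/widgets/
```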


 

Is it ok to show Googlebot different anchor text on links than actual users see?

23:08

John Mueller Help Hangout February 19 2019

So Googlebot should be seeing the equivalent version of the page that users would be seeing. So if this is something that's useful for users, I'd recommend showing it to Googlebot as well. If you're talking about the speed that it takes to render these pages, it's worth keeping in mind that we look at speed in kind of a holistic way; we don't just look at the speed of a page as Googlebot is fetching that page when it comes to using speed as a ranking factor. For crawling, of course, having pages that are available very quickly helps us quite a bit, so it makes sense to kind of make things fast for Google, for crawling at least. One thing I'd recommend here, though, is to try to avoid special-casing Googlebot if at all possible, because it does mean that maintenance is a lot harder. If you're doing something special for Googlebot, then it's a lot harder to tell if Googlebot is seeing errors on the page while users are seeing normal content, and Googlebot starts dropping those pages from search because of those errors. So ideally treat Googlebot the same as users. One way you could do that here is to use some kind of JavaScript to just asynchronously load that extra information. That's usually pretty easily doable, probably less effort than cloaking to Googlebot in a case like this.


Summary: No. You do not want to make special cases where Googlebot sees different content than a regular user.

Our note: This could be considered cloaking and could cause pages to drop out of the index.
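To illustrate John's suggested alternative (loading the expensive extra information asynchronously for everyone rather than serving Googlebot a different page), here is a minimal sketch; the /api/subpage-counts endpoint and the markup selectors are hypothetical.

```typescript
// Sketch: serve the same sidebar to all clients, then fill in the expensive
// sub-page counts asynchronously. The endpoint and selectors are hypothetical.
async function loadSubpageCounts(): Promise<void> {
  const res = await fetch('/api/subpage-counts');
  if (!res.ok) return; // the counts are a nice-to-have, so fail silently

  const counts: Record<string, number> = await res.json();
  for (const [categoryId, count] of Object.entries(counts)) {
    const badge = document.querySelector(`#sidebar [data-category="${categoryId}"] .count`);
    if (badge) badge.textContent = String(count);
  }
}

// Start after the HTML is parsed so the numbers never delay the initial render.
window.addEventListener('DOMContentLoaded', () => { loadSubpageCounts(); });
```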


 

If there are links pointing at Page A and Page A is canonicalized to Page B, does Google see those links as pointing to Page B?

26:18

John Mueller Help Hangout February 19 2019

So from our point of view, there are multiple things that come into play here. I think first of all, those pages should be equivalent: if you're saying that there's a canonical for that page, it should be equivalent to the final page as well. So it shouldn't matter which of these pages is used for forwarding links, because you're essentially telling us these pages are the same and we can treat them the same. So it shouldn't matter for our algorithms whether we pick up those links on that page or we pick up the links on the other page. I would also assume there that, kind of like in the other question with UTM parameters, we would not always pick the URL that is specified as canonical as the one that we actually index. So if those links are different across versions of pages, then that's something where we might not pass PageRank in the way that you're expecting. All of that kind of comes together, and from my point of view, what I'd recommend there is just really making sure that those pages are equivalent, so that you don't have to worry about which page the links are passing PageRank from. Rather, assume that it could be this page, it could be the other page; the rel canonical is a strong signal for us, but it's not a directive that prevents us from using that page completely.


Summary: If the pages are truly equivalent, then yes, links pointing to Page A would help support Page B.


 

Should you create a separate website for each country you want to be seen in?

26:58

John Mueller Help Hangout February 19 2019

So the short answer there is no, that's a terrible idea, especially if you don't really have content that's unique to every country in the world. In particular, if you're using hreflang between the different country and language versions, you need to keep in mind that hreflang does not make those pages rank higher in those countries; it just swaps out those URLs. So you still have to be able to rank in those individual locations. Which means if you have one piece of content and you split it up across every country in the world and multiply it by every language, then you'll have diluted your content significantly across a large number of URLs, which means that any piece of content there will have a lot more trouble being shown in the search results. Because instead of one strong page that we could show potentially worldwide, we have a ton of different pages that all show more or less the same content and none of them are particularly strong. So when it comes to an internationalization strategy, I would strongly recommend getting help from people who have done this successfully for other websites, so that you can really weigh the pros and the cons there. On the one hand, you have the advantage of being able to target individual countries and languages with content that's really uniquely specialized for them; on the other hand, you have to weigh that if you're diluting the content too much, then all of those versions will have a much harder time being visible in search. So on the one hand targeting well, on the other hand making sure that the versions that you do have available are strong enough to actually be visible in search. My general thought there is to err on the side of using fewer versions rather than more versions. So unless you have a really strong use case for having content specifically targeted to individual countries, I would err on the side of, well, maybe one global website is better, rather than splitting things up.


Summary: In most cases, it makes sense to have just one website and make proper use of hreflang to help Google surface the correct version. There may be some cases where it makes sense to have separate websites on country-level TLDs. This is often a tough decision; John recommends consulting an SEO who has experience with internationalization.
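For reference, hreflang is just a set of alternate annotations that every version of the page should carry, including a self-reference. A minimal sketch of generating the link elements for a handful of variants (the URLs and locales are placeholders):

```typescript
// Sketch: generate hreflang link elements for a page with a few
// language/country variants. URLs and locales are placeholders.
const alternates: Record<string, string> = {
  'en': 'https://www.example.com/widgets/',
  'de-de': 'https://www.example.com/de-de/widgets/',
  'de-ch': 'https://www.example.com/de-ch/widgets/',
  'x-default': 'https://www.example.com/widgets/',
};

const hreflangTags = Object.entries(alternates)
  .map(([locale, url]) => `<link rel="alternate" hreflang="${locale}" href="${url}" />`)
  .join('\n');

// Each variant should output the full set, including a link to itself.
console.log(hreflangTags);
```

Remember John's caveat: these annotations only tell Google which URL to show to which audience; they do not make any of the versions rank better.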


 

Can you use both rating schema and Q&A schema on the same content?

29:33

John Mueller Help Hangout February 19 2019

Sure, I don't see offhand why not. The main thing to keep in mind is that the structured data should be focused on the primary content of the page. So I don't know how exactly you would combine the rating and the Q&A schema on a page, but maybe it's something like you have questions and answers for a product and you have ratings for that product as well. That might be an option there. But in general, it's not that these types of markup are exclusive and you can only use one type; you can combine multiple types of markup.


Summary: Yes
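As a rough illustration, a product page could carry both an aggregate rating and question-and-answer markup as separate JSON-LD blocks. The values below are made up, and any real markup should be validated with Google's Rich Results Test.

```typescript
// Sketch: two structured-data objects describing the same page's primary content.
// All names and values are illustrative only.
const productWithRating = {
  '@context': 'https://schema.org',
  '@type': 'Product',
  name: 'Example Widget',
  aggregateRating: { '@type': 'AggregateRating', ratingValue: 4.6, reviewCount: 128 },
};

const questionAndAnswer = {
  '@context': 'https://schema.org',
  '@type': 'Question',
  name: 'Does the Example Widget work outdoors?',
  answerCount: 1,
  acceptedAnswer: { '@type': 'Answer', text: 'Yes, it is rated for outdoor use.' },
};

// Each object would be emitted in its own <script type="application/ld+json"> block.
for (const obj of [productWithRating, questionAndAnswer]) {
  console.log(JSON.stringify(obj, null, 2));
}
```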


 

Are affiliate links seen as a sign of a lower quality website?

41:05

John Mueller Help Hangout February 19 2019

So in general, as far as I know, we don't explicitly go into a site and say, well, there are links that look like affiliate links, therefore we will treat this website as being lower quality. In general, the main issue that I see with regards to affiliate websites is that they tend to just be lower-quality websites. So it's not so much a matter of where the checkout flow is, but rather that overall the content is often kind of low quality, and because of the low-quality content, that's something that our algorithms might pick up on and say, well, this is probably not the most relevant result to show in the search results. There can be really high-quality affiliate websites as well, which are good. So it's not so much a matter of whether there is an affiliate link or not, but rather, what about the rest of the website? Is this something that would be relevant to show to users, or is there something perhaps problematic there? I think, at least as far as I know, that would apply across the board, so it wouldn't really matter what the specific topic area of the website is. But in general, there are some really good affiliate sites and there are some really, really terrible affiliate sites. So it's more a matter of: is the site good or is the site terrible?


Summary: The presence of affiliate links is not a sign of low quality. However, if you have an affiliate site, in order to rank well you need to provide enough value that Google will want to rank your content alongside the original vendor's content.


 

If your site gets an influx of irrelevant links, should they be disavowed?

47:01

John Mueller Help Hangout February 19 2019

Sure, you can do that. Yeah, that's kind of what the domain entry in the disavow file is for; that way, with all of these millions or thousands of links, you can just say everything from this site, I just don't have anything to do with that. Usually what happens in cases like that, where you see this completely random site linking, is that site probably got hacked, and for whatever reason someone decided to drop all those links there when hacking the website. Usually we're pretty good at picking that up and trying to keep the state from before the hack in our systems. So probably we're already ignoring those links; it's possible that maybe we're even indexing the website from before, rather than with the hacked content. So that's something where usually I wouldn't worry about it too much, because these kinds of hacks are really common and we have a bit of practice trying to deal with that. But if you're worried about this, you know, like, oh, this is really crazy (and I think instrument repair links probably aren't something where you're saying, oh, this will kill my website if I'm associated with violins; maybe adult content is something you'd be more cautious about), then I would just use a disavow file, submit that, and you can be sure that these are not taken into account.


Summary: In most cases like this, Google is already ignoring the unusual influx of links. It is interesting to note, though, that John said it may be a good idea to disavow if the links are adult in nature or if they are topically relevant to your site. For example, if your music repair site received thousands of unnatural links pointing to it with the anchor "violin repair", then you may want to disavow those.
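For reference, a disavow file is a plain-text list where a domain: prefix covers every link from that site, which is what John is referring to. A minimal sketch of building one (the domains are placeholders):

```typescript
// Sketch: build a disavow file using domain: entries so every link from the
// listed sites is covered. The domains are placeholders.
import { writeFileSync } from 'node:fs';

const domainsToDisavow = ['spammy-links.example', 'hacked-instrument-site.example'];

const lines = [
  '# Disavowing entire domains rather than individual URLs',
  ...domainsToDisavow.map((domain) => `domain:${domain}`),
];

writeFileSync('disavow.txt', lines.join('\n') + '\n');
// Upload disavow.txt through the disavow links tool in Search Console.
```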


 

 

If you like stuff like this, you'll love my newsletter!

My team and I report every week on the latest Google algorithm updates, news, and SEO tips.

Full Video and Transcript

 

 

Question 1:25 - Would an X-Robots-Tag HTTP header stop Google from following a Location redirect, like a 301 or 302?

Answer 1:37 - No. So the nofollow would only apply to links on that page, and since it's a server-side redirect we wouldn't even look at the content of the page. So there would be no links to follow, so that wouldn't change anything.
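Our note: to make the scenario concrete, here is a minimal sketch of a server-side 301 that also sends an X-Robots-Tag header; as John says, no page content is served, so there are no links for a nofollow to apply to. Plain Node, and the hostname is a placeholder.

```typescript
// Sketch: a server-side 301 redirect that also sends an X-Robots-Tag header.
// Because no body is served, there are no on-page links for nofollow to affect.
import { createServer } from 'node:http';

createServer((req, res) => {
  res.writeHead(301, {
    Location: `https://new-domain.example${req.url ?? '/'}`,
    'X-Robots-Tag': 'nofollow', // Googlebot simply follows the Location header here
  });
  res.end();
}).listen(8080);
```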

Question 1:58 - One of our clients is an e-commerce store. They've been blocked by several ISPs, mobile ISPs, and we wondered what impact that might have on their organic rankings.

Answer 2:15 - So I guess the main thing from our point of view there would be whether or not Googlebot would also be blocked, and if Googlebot is not blocked, then we would be able to crawl and index the content normally. But of course, if Googlebot is also blocked, because it's, I don't know, blocked in the US for example, then that would of course result in those pages dropping out of search. But if it's just individual users that don't have access, and indexing and crawling otherwise work normally, that's usually less of an issue. Of course, you have the indirect effects there, that these users wouldn't be able to recommend the site, and we wouldn't be able to pick up those recommendations because they're not there from those users. So obviously, if you're blocking most of the users that would go to your website, that would probably have a long-term effect on the site.

Answer 3:25 - And this is something that some sites even do on purpose, not particularly for mobile users versus desktop, but some sites will say, I don't have anything that I can offer users in these individual countries, or for whatever policy reason my content is illegal in Switzerland because it's not neutral enough or whatever; then that site might choose to block users in those countries, and that's still something we have to deal with. So those users wouldn't be able to recommend it, and there might be some long-term effects there, but if we can still crawl and index the content, then that's generally fine.
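Our note: if you do block traffic at the network or country level, it's worth making sure requests claiming to be Googlebot really are Googlebot before they get caught in the block. Google documents a reverse-then-forward DNS check; here is a rough sketch of that idea in Node (the sample IP is illustrative).

```typescript
// Sketch: verify a "Googlebot" visitor by reverse DNS, then confirm with a
// forward lookup, so legitimate crawling isn't blocked by mistake.
import { promises as dns } from 'node:dns';

async function isGooglebot(ip: string): Promise<boolean> {
  try {
    const [hostname] = await dns.reverse(ip);
    if (!/\.(googlebot\.com|google\.com)$/.test(hostname)) return false;
    const forward = await dns.lookup(hostname);
    return forward.address === ip; // the hostname must resolve back to the same IP
  } catch {
    return false;
  }
}

isGooglebot('66.249.66.1').then((ok) => console.log(ok ? 'Googlebot' : 'not Googlebot'));
```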

Question - We had a manual action due to links. We've been slowly climbing back to the bottom of page 1, sometimes the top of page 2. I was wondering if it's due to the lack of links now, or is there something we can do to regain that trust so we can get back to our original position?

Answer 5:20 - It's hard to say sometimes, but in general, if a manual action is resolved then that manual action isn't affecting the site anymore. So there's no kind of hold-back effect that would keep a website held back, like saying, well, they did some things wrong once, therefore we'll prevent them from showing up high in search; there's nothing like that. But in particular when it comes to links, if you had a lot of unnatural links, and perhaps your site was ranking because of those unnatural links, then by cleaning those unnatural links up there is of course a new situation that our algorithms have to kind of adjust to first. So that's something that could to some extent have an effect there. The way I would approach this is just to continue working on the website how you normally would, and make sure you don't run into that situation again where the web spam team runs across issues with regards to your links. So don't go off and say, oh, I removed those links, therefore I'll go buy some links from another site instead; make sure that what you're doing is really for the long-term use of your website.

 

Question 7:32 - The question is generally about a website that moved to a different domain name; they also moved to a different framework and a different platform, an Angular site rather than a static HTML site, and since they did this move they've been seeing a significant drop in rankings and in visibility in search in general.

Answer 7:57 - So I've looked at a bunch of threads about the website and tried to follow up with some things internally to see what exactly is happening here, and it's kind of a complicated situation where I don't really have a straightforward answer. In particular, there's the move from a country-code top-level domain to a generic top-level domain, so some geotargeting information is a little bit lost there. There were some issues in the beginning with being able to index the content at all, in particular on the new site, and there are some significant changes with the way that the content is provided on the new site compared to the old site. So overall, there are lots of pretty significant changes along the way; some of them were problematic, like not being able to index the content, and some of them were just changes as they might normally happen on a website, where you would change the content significantly, for example. I think what is happening here is all of these changes have made it fairly hard for our algorithms to figure out what the new stable state should be for this website, and it's probably just something that's taking a lot longer than a normal site move would take. So this is, I think, one of those cautionary situations where you want to make a bunch of changes, and by making fairly significant changes all at once you're causing a lot of confusion with regards to our algorithms, where it might make sense to take these changes step by step, to test them individually, to see that every step in between is working as it should be, so that you're sure that the whole chain of changes that you make doesn't result in bigger negative effects with regards to search. So I guess my recommendation here would be to keep at it and keep looking at the site, working on the site to make it better. I think from an indexing point of view you're in a pretty good state there; you're using some kind of server-side pre-rendering, which makes it possible for us to pick up the content fairly quickly. It looks like you've made some significant changes to the speed of the website overall, so that's a positive change as well. And I suspect that all of these things will add up over time to be reflected in search as well. I wish these kinds of changes would be processed a little bit faster, so I've been pinging some folks on the search engineering side as well to double-check that everything is working as expected there. But in general, with a lot of bumpy changes on a site and moving to a different domain, all of these things can make a site move a bit tricky.

 

Question 11:03 - We have a good first meaningful paint, two seconds, but our time to interactive is 12 to 15 seconds, affected by a large amount of scripts according to Lighthouse, even though it loads very fast for users. We're about to launch a new website and we wonder if it's crucial to fix this before launch, or can it wait a few months? How does Google view time to interactive?

Answer 11:29 - So a lot of these metrics that you're looking at from Lighthouse are primarily presented to you with regards to kind of the user-facing side of things. From our point of view, from a search point of view, we take a variety of these metrics together to figure out how we should kind of be seeing the site with regards to speed, but the kind of absolute measurements that you see there in Lighthouse are things that most likely have a stronger effect on how users see your site. So instead of asking Google, with regards to SEO, is this site too slow, I would look at it with users and kind of get their feedback instead, and if you're saying that your site is actually pretty fast for users, that it provides the content fairly quickly, then probably you're in a good state there.

 

Question 12:24 - Google just explained in a white paper released a few days ago that it uses PageRank via links across the web to evaluate authoritativeness and trustworthiness algorithmically. Can we assume that expertise is primarily evaluated via content quality algorithmically? Can you elaborate on this at all?

Answer 12:51 - I don't have any insight there, so that's something where I don't really have anything specific to say. I just saw that white paper, I don't know, yesterday or the day before as well. It seems pretty interesting, but of course it's a fairly long paper and there are lots of different topics in there, and PageRank is just more or less a side comment there. So I wouldn't say everything is just PageRank.

Question 13:23 - A few months ago the Google Chrome labs team released quicklink. Which offers faster subsequent page loads by prefetching in viewport links during idle time. I know this is beneficial for users but would it have any effect on Googlebot and ranking or is every page that Googlebot visits viewed as a first-time visit, clean slate?

Answer 13:45 - You're correct. So every time Google loads a page, that's seen as kind of a first-time, fresh visit on that site or on that page. We generally don't keep any cookies; it's not that we keep session state where we would say, oh, we loaded this one page and we're following the links here, therefore we'll keep that state and click on those links to get to those new pages. Rather, we'll find those URLs and then fetch those URLs individually. So in a case like this I don't see any positive or negative effect on Googlebot. It does sound like it's something that would significantly speed things up for users, so that's a good thing, and you'd probably see some indirect effects there, in that when users are able to use your site very quickly, they often spend more time on the site, they look around a little bit more, and in turn it's probably more likely that they'll recommend that site to others as well, which is something that we might be able to pick up.
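Our note: the idea behind quicklink can be hand-rolled with standard browser APIs: watch for links entering the viewport, then prefetch them during idle time. The sketch below illustrates the concept only; it is not the quicklink library's actual code.

```typescript
// Sketch of the quicklink idea: prefetch in-viewport links during idle time
// so subsequent navigations are faster, without slowing the initial load.
const prefetched = new Set<string>();

function prefetch(url: string): void {
  if (prefetched.has(url)) return;
  prefetched.add(url);
  const link = document.createElement('link');
  link.rel = 'prefetch';
  link.href = url;
  document.head.appendChild(link);
}

const io = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const href = (entry.target as HTMLAnchorElement).href;
    requestIdleCallback(() => prefetch(href)); // only do the work when the browser is idle
    io.unobserve(entry.target);
  }
});

document.querySelectorAll<HTMLAnchorElement>('a[href]').forEach((a) => io.observe(a));
```

As John notes, Googlebot fetches each URL individually with a clean slate, so this only changes the experience for users.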

Question 12:52 - How much time can it take to restore rankings if I move a site from an old domain to a completely new one and ensure that all 301 redirects are in place?

Answer 15:02 - So this is something that can happen very quickly if you do a clean site move from one domain to another, where everything is essentially the same and we can recognize that this old URL is redirected to the same URL on the new domain. Then we can generally pick that up fairly quickly, and usually in the indexing overview in Search Console you'll see that one goes down just about as the other one goes up, and things shift over with maybe a day or so of a kind of bump in between. On the other hand, if you make significant changes across your website, so different URLs, different frameworks, different kinds of content, then all of that can make it a lot harder for us, in that we really have to re-evaluate the new site completely and, based on the new website, figure out how we should position this site. That's kind of the difference there, but if it's really just a clean, one-URL-moves-to-the-same-URL-on-another-domain kind of change, that's really no problem.

 

Question 16:18 - Will a new domain inherit SEO juice and also rank for new keywords?

Answer 16:25 - So essentially, if we can really transfer everything one to one to the new domain, then that new domain will be in place of the older domain. It can rank for the old keywords and it can rank for new keywords. Again, if you make significant changes during that move, then of course that new state has to be taken into account, and it's not just that we can forward everything to the new one, because the new one isn't just a moved version of the old one.

Question 16:58 - A question regarding parameter URLs with UTM links. Will these links dilute the link value if they're heavily linked internally? We're getting pages indexed with parameters where the canonical is pointing to the preferred version. How will it affect things in the long run if we link within the website with 80% parameter URLs and 20% clean URLs?

Answer 17:33 - I guess that's always a bit of a tricky situation, because you're giving us mixed signals essentially. On the one hand you're saying these are the links I want to have indexed, because that's how you link internally within your website. On the other hand, those pages, when we open them, have a rel canonical pointing to a different URL. So you're saying index this one, and from that one you're saying, well, actually index a different one. So what our systems end up doing there is they try to weigh the different types of URLs that we find for this content. We can probably recognize that these URLs lead to the same content, so we can kind of put them in the same group, and then it's a matter of picking which one to actually use for indexing. On the one hand we have the internal links pointing to the UTM versions; on the other hand we have the rel canonical pointing to kind of the cleaner version. The cleaner version is probably also a shorter, nicer-looking URL, which kind of plays in line with that as well, but it's still not guaranteed from our point of view that we would always use the shorter one.

So rel canonical is obviously a strong signal; internal linking is also kind of a strong signal, in that it's something that's under your control. So if you explicitly link to those URLs, we think maybe you want them indexed like that. In practice, what would probably happen here is we would index a mix of URLs. Some of them we would index as the shorter version, because maybe we find other signals pointing at the shorter version as well; some of them we would probably index with the UTM version, and we would try to rank them normally as the UTM version.

In practice, in search you wouldn't see any difference in ranking; you would just see that these URLs might be shown in the search results. So they would rank exactly the same with UTM or without UTM, and they would just be listed individually in the search results. From a practical point of view, that just means that in Search Console you might see a mix of these URLs. In the performance report you might see this mix; in the indexing report you might see a mix; and in some of the other reports, maybe around AMP or structured data if you use anything like that, you might also see this mix. You might also see, in some cases, a situation where it swaps between the URLs. So it might be that we index it with UTM parameters at one point, and then a couple weeks later we switch to the cleaner version and say, well, probably this cleaner version is better, and then at some point later on our algorithms look at it again and say, well, actually more signals point to the UTM version, we'll switch back. That could theoretically happen as well.

So what I would recommend doing there is, if you have a preference with regards to your URLs, make sure that you're being as clear as possible within your website about which version you want to have indexed. With UTM parameters you're also creating a situation where we'd have to crawl both of those versions, so it's a little bit more overhead. If it's just one extra version, that's probably not such a big deal. If you have multiple UTM parameters that you're using across the website, then we would try to crawl all of the different variations, which would in turn mean that maybe we crawl, I don't know, a couple times as many URLs as your website actually has to be able to keep up with indexing. So that's probably something you'd want to avoid. My recommendation would be to really try to clean that up as much as possible, so that we can stick to the clean URLs, to the URLs that you want to have indexed, instead of ending up in this state where maybe we'll pick them up one way, maybe we'll pick them up another way, and in your reporting it could be either, and you have to watch out for that all the time. So keep it as simple as possible.

 

Question 21:56 -  Will there be any ability to test and view desktop renders with screenshots in the URL inspection tool in search console?

Answer 22:05 - I get these kinds of questions all the time, like will you do this in search console or will you do that. In general, we try not to pre-announce things, so I don't really have any insight that I can give you there with regards to what will happen in the future. It does seem like something that is kind of a missing aspect of the tool, in that it's focusing on mobile at the moment, but it might be nice to actually test the desktop version as well. So I'll definitely pass that on to the team to make sure that it's on their radar somewhere.

 

Question 22:41 - On one website with a couple million pages, we have a sidebar which has internal links in it, and next to each there's a number which represents how many subpages that link leads to, sort of as an informative visual aid. Generating those numbers takes a lot of time; is it okay to remove those numbers from the page when Googlebot is detected?

Answer 23:08 - So Googlebot should be seeing the equivalent version of the page that users would be seeing. So if this is something that's useful for users, I'd recommend showing it to Googlebot as well. If you're talking about the speed that it takes to render these pages, it's worth keeping in mind that we look at speed in kind of a holistic way; we don't just look at the speed of a page as Googlebot is fetching that page when it comes to using speed as a ranking factor. For crawling, of course, having pages that are available very quickly helps us quite a bit, so it makes sense to kind of make things fast for Google, for crawling at least. One thing I'd recommend here, though, is to try to avoid special-casing Googlebot if at all possible, because it does mean that maintenance is a lot harder. If you're doing something special for Googlebot, then it's a lot harder to tell if Googlebot is seeing errors on the page while users are seeing normal content, and Googlebot starts dropping those pages from search because of those errors. So ideally treat Googlebot the same as users. One way you could do that here is to use some kind of JavaScript to just asynchronously load that extra information. That's usually pretty easily doable, probably less effort than cloaking to Googlebot in a case like this.

 

Question 24:47 - Does link equity pass through links on canonicalized pages, or is it just ignored and everything flows to the canonical? So I think the question is more that you have one page that has a rel canonical pointing to a different page, and the question is: will the links on that original page pass PageRank, or will only the links on the specified canonical page pass PageRank?

Answer 25:18 - So from our point of view, there are multiple things that come into play here. I think first of all, those pages should be equivalent: if you're saying that there's a canonical for that page, it should be equivalent to the final page as well. So it shouldn't matter which of these pages is used for forwarding links, because you're essentially telling us these pages are the same and we can treat them the same. So it shouldn't matter for our algorithms whether we pick up those links on that page or we pick up the links on the other page. I would also assume there that, kind of like in the other question with UTM parameters, we would not always pick the URL that is specified as canonical as the one that we actually index. So if those links are different across versions of pages, then that's something where we might not pass PageRank in the way that you're expecting. All of that kind of comes together, and from my point of view, what I'd recommend there is just really making sure that those pages are equivalent, so that you don't have to worry about which page the links are passing PageRank from. Rather, assume that it could be this page, it could be the other page; the rel canonical is a strong signal for us, but it's not a directive that prevents us from using that page completely.

 

Question 26:50 - Is it a good idea to create a website for every country in the world?

Answer 26:58 - So the short answer there is no, that's a terrible idea, especially if you don't really have content that's unique to every country in the world. In particular, if you're using hreflang between the different country and language versions, you need to keep in mind that hreflang does not make those pages rank higher in those countries; it just swaps out those URLs. So you still have to be able to rank in those individual locations. Which means if you have one piece of content and you split it up across every country in the world and multiply it by every language, then you'll have diluted your content significantly across a large number of URLs, which means that any piece of content there will have a lot more trouble being shown in the search results. Because instead of one strong page that we could show potentially worldwide, we have a ton of different pages that all show more or less the same content and none of them are particularly strong. So when it comes to an internationalization strategy, I would strongly recommend getting help from people who have done this successfully for other websites, so that you can really weigh the pros and the cons there. On the one hand, you have the advantage of being able to target individual countries and languages with content that's really uniquely specialized for them; on the other hand, you have to weigh that if you're diluting the content too much, then all of those versions will have a much harder time being visible in search. So on the one hand targeting well, on the other hand making sure that the versions that you do have available are strong enough to actually be visible in search. My general thought there is to err on the side of using fewer versions rather than more versions. So unless you have a really strong use case for having content specifically targeted to individual countries, I would err on the side of, well, maybe one global website is better, rather than splitting things up.

 

Question 29:25 - Can we use both rating schema and question-and-answer schema for questions and answers?

Answer 29:33 - Sure, I don't see offhand why not. The main thing to keep in mind is that the structured data should be focused on the primary content of the page. So I don't know how exactly you would combine the rating and the Q&A schema on a page, but maybe it's something like you have questions and answers for a product and you have ratings for that product as well. That might be an option there. But in general, it's not that these types of markup are exclusive and you can only use one type; you can combine multiple types of markup.

 

Question 30:41 - So our site had maybe 5,000 Google visitors per day for several months; now we're down to under 500. This was sudden, it just happened in one day. We think it's due to an automatic security penalty that we got removed within one business day, and we just haven't seen any change within three weeks. So I'm wondering how long something like that might take to recover from, and if there's anything else that would cause just a sudden one-day drop like that?

We got a message in search console that social engineering content was detected. It turns out that was due to our email service provider; their click tracking software was automatically flagged, because I guess another client was using it. So we got a manual review within a business day and they said it was okay, but you know.

Answer 31:48 - Okay, so if there was a manual review, then that would be resolved. So it's not that there is kind of a hold-back after a manual review that says, oh, we have to be careful with this website; that should be resolved there. It's kind of hard to say just in a general case with regards to a site like that. There are sometimes algorithmic changes that roll out where a bigger change happens within a day or so, so that might also be playing a role here, and it might also be that there's something else kind of playing in with the site there. So what I generally recommend doing is maybe starting a thread in the webmaster help forum with the URL of your site and some of the queries where you're seeing changes. So not just, we had this many queries and now it's down to this, but rather, people are searching for our company name, they're still finding our company name, but for this particular kind of query we're no longer visible anymore. That kind of thing is really useful to post in the webmaster help forum, and usually also, when people can't find an answer for that, the experts there are able to escalate it on to us so that we can take a look at that as well.

Question 33:24 - So they have multiple sites in German for different countries and they have the hreflang set up, it sounds like, properly. This is just with the example URLs, so it's hard to say, but they're saying that they're not seeing a lot of indexing of these URLs across the different country versions, so what could be happening here?

Answer 33:54 - So usually when I see questions like this, it boils down to our algorithms looking at these pages and saying, well, these pages are essentially duplicates of each other, so we can index one of these versions and we can show that one to users, because it's all the same. In particular, in German this is something I see quite a bit; I imagine in some other languages it's similar, probably in Spanish, where there are multiple Spanish-speaking countries, or English of course as well, where the content is essentially the same and it's just targeted at different countries. So what happens here is, from an indexing point of view, we would probably, at least potentially, fold these URLs together. So for the multiple German versions, we pick one German version and say this is the canonical for the German version; with the hreflang markup we'd still be able to show the other URLs, though. So we might index maybe the Swiss German version, and if a user in Germany were searching, we would be able to swap out that URL for the German version in the search results. But in the indexing report in search console, you would only see that URL being indexed for the Swiss version, which is the one that we chose as the canonical; you wouldn't see that for the German version. In practice, that can be okay if these pages are really exactly the same, because we're swapping out the URL and users are getting to the right version, so that's pretty much okay. The difficulty can be that if you have something in the title, or if you have prices or other structured data on those pages, that might be something that would confuse users, in that a user in Germany might be searching, we show the German URL, but in the snippet it has Swiss currency as a price somewhere. So that could be confusing to people. There are usually two approaches to that. On the one hand, you could take the approach and say, well, folding these URLs together is fine, it kind of makes those URLs a bit stronger, for me that's okay; then you might go in there and say, well, I'll make sure that these pages are generic across all of the German versions, so that it doesn't matter for me which URL is actually getting indexed. The other approach is to make sure that these URLs are clearly seen as being separate URLs, so making sure that the content is really significantly different across the different language versions, so that our systems, when they look at these pages, say, well, this is one German page and this is a different German page; there's an hreflang link between those two, so we can swap out those URLs, but they're unique pages and they need to be indexed individually on their own. So those are kind of the two approaches there: on the one hand you could fold things together, on the other hand you could say, well, I want these explicitly separate.

Question 37:00 - What would you advise if a brand has an international presence and wants to shut it down for certain regions? What do you do with the sites that are targeted at other countries, should they redirect to the main site?

Answer 37:14 - Essentially this is kind of a site migration situation, I guess: if you have one country site and you're saying, well, this site is no longer available, I'll redirect it to another version of my site, that's something you can do. What you can't really do is tell us that we should not show this site in particular countries in search. So you can kind of fold your German versions together and say, Germany, Austria, Switzerland, they should all just be the one German website; that's fine. But saying, I don't want to show my site to users in Switzerland, perhaps, you can't do. There's no meta tag where you can tell Google, don't show my site in Switzerland. So at most, what you can do is block users in that country from accessing the site, but you can't tell Google not to show it in the search results for that country.

Question 38:18 - That was my question. Our website is in English and we have unique content for different countries. Because of a business decision, they're kind of shutting down certain regions. So if we do some redirecting for the site, will it impact my main website, will it have a negative or positive effect? I mean, we currently have hreflang across all the subsites, which have common pages, where we're offering one product in one specific country and the same product in another country with different content and all the regional local content. So I'm wondering what will happen to those pages? If I redirect those pages to my main site, will my main pages rank for those specific regions?

Answer 39:19 - That could potentially happen, but again, that's not something you can really control. So if you're redirecting those pages, and if those pages still exist, then they might rank in those locations as well. There's no meta tag where you could say, well, my website should be indexed but not shown in this particular country.

Question 39:50 - Google has these UX playbooks with best practices to delight users in different niches. Are these considered part of ranking, or can you give some insight on that?

Answer 40:02 - So those UX playbooks, as far as I know, are put out by the ads team, and it's more a matter of, well, these are UX best practices and good things to do on a website, and not necessarily something where we'd say, well, these play into ranking. But obviously, if you make a good website that works well for users, then indirectly you can certainly see an effect on ranking. But it's not that we would say, look at these UX playbooks and specifically use those as factors for ranking.

 

Question 40:38 - What's the impact of affiliate links when mixed with content targeting sensitive topics? We know that aggressive monetization can be problematic, especially when affiliate links aren't disclosed to users and when the focus on monetization is more important than providing great content to users, but I'm wondering how that would work for topics like health, medical, financial, etc.?

Answer 41:05 - So in general, as far as I know, we don't explicitly go into a site and say, well, there are links that look like affiliate links, therefore we will treat this website as being lower quality. In general, the main issue that I see with regards to affiliate websites is that they tend to just be lower-quality websites. So it's not so much a matter of where the checkout flow is, but rather that overall the content is often kind of low quality, and because of the low-quality content, that's something that our algorithms might pick up on and say, well, this is probably not the most relevant result to show in the search results. There can be really high-quality affiliate websites as well, which are good. So it's not so much a matter of whether there is an affiliate link or not, but rather, what about the rest of the website? Is this something that would be relevant to show to users, or is there something perhaps problematic there? I think, at least as far as I know, that would apply across the board, so it wouldn't really matter what the specific topic area of the website is. But in general, there are some really good affiliate sites and there are some really, really terrible affiliate sites. So it's more a matter of: is the site good or is the site terrible?

 

Question 42:35 - With fetch and render being deprecated from the old search console and being replaced with the URL inspection tool, will the URL inspection tool be updated to include a side-by-side visual comparison of how Googlebot sees a page versus how a user would see it?

Answer 42:53 - I don't know. As with the other question, we try not to pre-announce things, so that's not something where I'd be able to say anything specifically with regards to what will happen in the future. I'm also happy to pass this on to the team to take a look at with regards to prioritization. From a practical point of view, I have always felt that this comparison is something that's less useful nowadays, because most sites are pretty good at making sure that a page can be rendered for search engines. At least the sites that I've looked at tend to be kind of on the good side there. In particular, in the beginning, when sites weren't so used to search engines actually trying to render a page, that was a big thing, but nowadays I don't know if that's really such a big problem, like if JavaScript is blocked by robots.txt, those kinds of things. I feel that has changed significantly over the years, but I am happy to pass it on to the team, and they'll probably double-check to see if there really are issues that we need to address there.

 

Question 44:12 - Can link disavow boost rankings?

Answer 44:17 - Oh my gosh, that's such a big question. So in theory, if your website is demoted with a manual action with regards to links that you've bought, or that someone else has bought for your website over time, maybe a previous SEO, maybe something that was set up before your time with that company, then you can use the disavow tool to kind of remove those links from our systems, and afterwards you can use the reconsideration request form to let us know about this change. Then the manual webspam team will take a look and double-check to see that the current status is okay, and if the current status is clean, then they will resolve that manual action, and in general your website will be able to rank naturally again. So that's something where in many cases the website might go up a little bit in ranking; in some cases it might also be that these unnatural links were unnaturally supporting the website in the beginning, so by removing those links it might be that this support is gone, which is kind of the natural state. So I guess the shorter answer is there is no yes or no here, in that the link disavow does change what we see as links to your website, and that can affect your website positively or negatively. My recommendation generally with the disavow tool is: use this if you have a manual action with regards to links and you can't get those links removed, or if, when you look at your links, you realize, oh, there was this crazy thing that we did maybe two years ago and the web spam team didn't notice, but maybe it's time to clean up our act and clean that up; then that's something where it makes sense to use the disavow tool. I wouldn't just randomly use this and say, well, these are weird links and I don't want to be associated with them, I hope Google might rank my website better when I remove those links; that's unlikely to happen.

 

Question 46:27 - John, can I ask a question based on disavows here? So we see sometimes, I'm guessing maybe competitors, we'll just see literally one place send 1.1 or 1.2 million links to our homepage. Completely off-topic, an instrument repair site out of Ireland, nothing to do with what we do obviously. I don't have the tools or ability to disavow all the links; is it best to then disavow the site?

Answer 47:01 - Sure, you can do that. Yeah, that's kind of what the domain entry in the disavow file is for; that way, with all of these millions or thousands of links, you can just say everything from this site, I just don't have anything to do with that. Usually what happens in cases like that, where you see this completely random site linking, is that site probably got hacked, and for whatever reason someone decided to drop all those links there when hacking the website. Usually we're pretty good at picking that up and trying to keep the state from before the hack in our systems. So probably we're already ignoring those links; it's possible that maybe we're even indexing the website from before, rather than with the hacked content. So that's something where usually I wouldn't worry about it too much, because these kinds of hacks are really common and we have a bit of practice trying to deal with that. But if you're worried about this, you know, like, oh, this is really crazy (and I think instrument repair links probably aren't something where you're saying, oh, this will kill my website if I'm associated with violins; maybe adult content is something you'd be more cautious about), then I would just use a disavow file, submit that, and you can be sure that these are not taken into account.

 

Question 48:44 - Two quick questions regarding this URL. It's a category URL for an e-commerce website that has a kind of view-all parameter. Basically, our issue is that the URL has a canonical to the version without the parameter, it's linked to the version without the parameter throughout the website, and it's in the sitemap without the parameter, but somehow Google decides, no, this is the canonical version, even though all of our signals are pointing to the other version. I don't think it affects us much, but I don't understand why Google doesn't pick our version when all of our signals are pointing in the same direction?

Answer 48:24 - I don't know, hard to say offhand. So I think one thing to keep in mind is that it's under mobile-first indexing, so perhaps the mobile version is slightly different, maybe there's a rel canonical there, maybe there's something with JavaScript that's changing the rel canonical, I don't know, but that's always something to keep in mind. So when you're testing it in the browser, make sure you double-check the mobile view. But as with both the other questions earlier, it's also just possible that for whatever other reasons we're deciding, well, actually this is the cleaner version.

Question 50:33 - If it's not linked from anywhere, and you know the internal links are not pointing to it, it's not in the sitemap, and with the URL inspection tool Google does see the correct canonical that we set; it's just that the Google-selected canonical is still this one rather than what we tagged. So I don't know if it would have any effect; I'm just worried that there might be something that we're missing that might affect other pages as well.

Answer 51:11 - Yeah, I don't know, I'd have to take a look. In general, if it's the same content, then the ranking would be the same, so it's not that anything will change there. I don't know if bribing me with haircare products would be particularly useful, but I don't know. So offhand, it's a situation where you have kind of a view-all page and the other version of the page, where it might be that we're picking up some signals saying, well, this is the view-all version, we should keep that one instead of the other one, because it might be paginated or something like that, but I don't know, I'd have to dig into that. The other thing maybe worth mentioning is we'd like to put together some document with regards to best practices around e-commerce sites in general. So if you run across weird things like this, where with e-commerce sites you often have these category pages that have tons of content, and the pagination and the filtering and the view-all things are kind of tricky, then that's something I'd love to get feedback on. So maybe examples, maybe just general questions with regards to e-commerce sites, all of that would be really useful to have. Ideally, maybe send those via Twitter so that I can collect them there; I started a tweet on that, and hopefully we can put something together with, I guess, best practices with regards to e-commerce sites on how you can make sure that indexing and crawling work ideally.