John is back! Here are some of the best questions and answers from the recent Webmaster Help Hangout. This week John answered some great questions regarding 302 redirects, page speed, and best practices for merging subdomains. Full video and transcript at the end!

Can you block one part of a page, like a video, from Google?

1:16

John Mueller February 6 Help Hangout

I think that's kind of tricky, because there is no explicit way to say "I don't want my page to be shown like this in the search results." So what I tend towards is trying the expiration date approach, which I believe you can also specify in structured data on the page. With JSON-LD structured data there's a way to specify an expiration date for a video that we picked up on a page. So that might be something I'd try.

 


Summary: You can’t really ask Google to index just part of a page, or only include the page in certain SERP features. However, you can use structured data to add an expiration date to a video which should make it not appear in search results.

Here is more information from Google on video schema and also the use of expiration dates.
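As a concrete illustration, here is a minimal sketch of what VideoObject markup with an expiration date might look like, shown in Python for clarity. The title, URLs, and dates are placeholders; check Google's video structured data documentation for the full list of required properties.

```python
import json

# Hypothetical example: VideoObject markup with an "expires" date in the past,
# which signals that the video should no longer be surfaced in video results.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Stock footage sample",                       # placeholder title
    "description": "Example clip used for illustration.",
    "thumbnailUrl": "https://www.example.com/thumb.jpg",  # placeholder URL
    "uploadDate": "2018-06-01",
    "expires": "2019-01-01",                              # date in the past
}

# Embed the JSON-LD in the page's HTML.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(video_markup, indent=2)
    + "</script>"
)
print(script_tag)
```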


What does it mean if pages from your site are getting deindexed and GSC shows "Crawled, not indexed"?

5:23

John Mueller February 6 Help Hangout

So it's really hard to say without looking at specific examples from your website. What I would recommend doing here is posting in the Webmaster Help Forum with the exact URLs and the queries that you're looking at, so that folks there can take a look. Sometimes it's something simple and technical, such as a blog being set up in a way that noindexes these pages by default, but usually the folks in the webmaster forum are very quick at recognizing these kinds of common issues and can help you to narrow things down, or to escalate them if there's really something completely weird happening here. Another thing to keep in mind is that we don't index all pages that we've seen. It's really common for websites to have lots of pages on them that we know about but don't necessarily crawl and index, and that's not necessarily a problem. That's just our algorithms trying to figure out where it makes sense to focus our energy and where to maybe focus a little bit less energy, and usually that also means those pages are unlikely to be that visible in search anyway, so it wouldn't change that much for your site overall if they were indexed. But again, I would definitely check in with the Webmaster Help Forum to see what some of the possible causes could be.


Summary: It could be due to a technical issue causing the pages to be noindexed. Also remember that Google doesn’t always index every page that you publish. Our note: Be sure that you are not publishing blog content for the sake of just having content. It has a much higher chance of getting indexed if it is super valuable!
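If you suspect a technical issue like this, a quick first check is to fetch a few affected URLs and look for a noindex directive in the robots meta tag or the X-Robots-Tag header. A rough sketch using the third-party requests library; the URL is a placeholder, and the meta tag check is only an approximate string match:

```python
import requests

def find_noindex(url):
    """Return a list of places where a noindex directive was found."""
    findings = []
    resp = requests.get(url, timeout=10)

    # 1. HTTP header, e.g. "X-Robots-Tag: noindex"
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        findings.append("X-Robots-Tag header")

    # 2. Robots meta tag in the HTML, e.g. <meta name="robots" content="noindex">
    html = resp.text.lower()
    if 'name="robots"' in html and "noindex" in html:
        findings.append("robots meta tag (approximate check)")

    return findings

print(find_noindex("https://www.example.com/blog/some-post/"))  # placeholder URL
```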


Do you need to add spammy links to your disavow file?

8:17

John Mueller February 6 Help Hangout

So in general you don't need to keep on top of this. It's something where pretty much any website, if you look at the inbound links, will have a bunch of links that are just kind of spammy or irrelevant, and that's perfectly fine. We are pretty good at ignoring a lot of the cruft that websites collect over the years. I wouldn't focus on this. If it's something where you know that previously you went out and bought links, or you had an SEO do something really weird with regards to links, then it would make sense to clean that up. But if you're not aware of anything and things are otherwise kind of okay, then I wouldn't worry about this.


Summary: Every website attracts spammy links. Google is really good at ignoring these. The types of links you should be disavowing are ones that were purposely made for SEO.
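If you do have known paid or manipulative links to clean up, the disavow file itself is just a plain text list, one entry per line, with # comments, full URLs, or domain: entries. A small sketch that writes such a file; the domains and notes are made up:

```python
# Hypothetical sketch: building a disavow file for links you know were bought
# or built manipulatively. Domains here are placeholders.
disavow_lines = [
    "# Paid links from an old campaign, site owner contacted 2019-01-15",
    "domain:spammy-link-network.example",
    "# A single URL rather than a whole domain",
    "http://another-site.example/paid-links-page.html",
]

with open("disavow.txt", "w") as f:
    f.write("\n".join(disavow_lines) + "\n")
```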


How does Google handle websites that require location information from a user in order to improve UX?

9:29

John Mueller February 6 Help Hangout

So I assume with location finder you mean something where your website would recognize where the user is located and maybe show the local phone number or local address based on the user's location. That's not something that we would do with regards to search, so that's generally not something we even pick up on. In general, when Google renders pages it also denies these kinds of additional information requests, so that would be something where we probably wouldn't even notice it.


Summary: Googlebot does not pay attention to location requests. Our note: Make sure that your important page information is not reliant on receiving location information from a user.


Does a 302 redirect pass PageRank?

10:10

John Mueller February 6 Help Hangout

A 301 is a permanent redirect, which tells us that a new page is replacing an existing page forever, and a 302 redirect tells us that the content is temporarily available on a different URL. From a practical point of view, what generally tends to happen is that with a 301 redirect we'll focus on the destination page: we'll index that URL and move all the signals there. Whereas with a 302 redirect we'll focus on the initial page and try to keep all of the signals there. So it's not a matter of PageRank passing or not, but rather which of these URLs is actually the one that keeps the signals. It's not that these signals get lost with the redirect; it's more a question of which URL they stay with, and that's kind of the main difference.

So if you're tracking which of these pages is ranking, the redirect target or the initial page that is doing the redirecting, then with a 302 you'll probably tend to see the initial page ranking, because that's the one that we pick, since you're telling us it's just temporary and the content is somewhere else. Whereas with a 301 redirect, we'll probably tend to rank the destination page more. And as always, people get this wrong regularly, and our algorithms try to figure out what it is that people are trying to do. So if we see a 302 redirect in place for a longer period of time, we might decide that the webmaster probably meant this to be a 301 redirect, treat it as such, and shift all of our signals over to the destination page. So that's something where you might see those kinds of changes happening over time as well. It's not that link equity or our signals flow through a redirect or not, but rather which of these URLs we end up picking for our indexing, and that's the URL that ends up getting all these signals.


Summary: A 302 redirect is supposed to be a temporary redirect, while 301 is permanent. With a 302, Google tries to continue to focus on the original page rather than the page to which the user is redirected. However, if a 302 is in place for long enough, it gets treated as a 301. Our note: As such, eventually, a 302 should start to pass PageRank. But, we don’t know how long that takes.
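To see which kind of redirect a URL is actually returning, you can fetch it and inspect the status code of each hop in the redirect chain. A small sketch using the third-party requests library; the URL is a placeholder:

```python
import requests

def inspect_redirect(url):
    """Print the redirect chain for a URL, showing 301 vs 302 at each hop."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    for hop in resp.history:  # each intermediate redirect response
        print(f"{hop.status_code}  {hop.url}  ->  {hop.headers.get('Location')}")
    print(f"{resp.status_code}  {resp.url}  (final)")

inspect_redirect("http://www.example.com/old-page")  # placeholder URL
```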


Can unflattering or negative language on a page cause it to be treated as lower quality by Google?

13:02

John Mueller February 6 Help Hangout

Question from the site owner: I had a quick question. As you know, we've been dealing for years with all kinds of strange ranking issues. One of my partners, when he was dealing with some YouTube issues, had a potential theory based on disallowed words, for instance. Sometimes we would refer to a car salesman as a moron or something like that in some of our articles. Basically, we wanted to find out whether there could potentially be an issue with using language like that on certain pages of the site. I mean nothing like profanity, but just kind of insult-type stuff like that. We still see the phenomenon where other websites that have basically stolen our content and then slightly modified it rank where we used to, and then we rank kind of badly. When they modify the content after they steal it, they don't tend to steal those pages on the site that have that type of language usage.

John's answer: I don't think that would affect anything. Some pages have user-generated content on them and use this kind of informal language as well, so I think that's perfectly fine.


Summary: No. Our note: This doesn’t mean you should ignore all UGC. Google has said in the past that spammy UGC can be seen as a sign of low quality.


Should rel-prev/next be used for “related articles”?

17:41

John Mueller February 6 Help Hangout

No. The rel=next linking is really just for pagination series; it's not something that's meant for related links. So I would just cross-link those related articles normally. Usually that's what these plugins tend to do.

 

 


Summary: No. This is only for paginated pages.


How long should it take for a slow site to see better rankings after making page speed improvements?

24:32

John Mueller February 6 Help Hangout

So as with pretty much anything related to web search, it's not something where there's a fixed time frame involved, but rather we crawl and index pages over time, and we update these signals that we have for these pages over time. There's no fixed timeline; some of these pages and signals get updated every day or even more frequently, some take a little bit longer, and some take months to get updated. So what you'll probably see here, if you make significant improvements within your website, is this kind of gradual rise over time with regards to us taking those signals into account. It might be very tricky when it comes to speed, in the sense that speed is not the most important ranking factor. We do look at things like content and try to figure out which of these pages are most relevant to users as well. So if a site is really fast, that doesn't mean it's always ranking number one; theoretically an empty page would be the fastest page, but that doesn't mean it's a good result for users, so speed is more of a smaller ranking factor there. You'd probably see bigger changes in your site's visibility in search over time based on improvements in the quality of the website overall, and speed is more like something small that would probably even be hard to measure individually.


Summary: Google updates their signals about page speed for a page over time. It's not like you have to wait for a Page Speed update. If speed is holding you back from ranking well, and you make improvements, you should see a gradual increase in rankings.
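One simple way to sanity-check whether your improvements actually changed anything server-side is to time a handful of fetches before and after the change. This only measures raw download time, not rendering or real-user experience, so treat it as a rough signal; the sketch below uses the third-party requests library and a placeholder URL:

```python
import time
import requests

def average_fetch_time(url, runs=5):
    """Average wall-clock time to download a URL, in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

print(average_fetch_time("https://www.example.com/"))  # placeholder URL
```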


Will the robots.txt tester tool be added to the new Search Console?

35:00

John Mueller February 6 Help Hangout

I don't know, we haven't announced that yet, but we're trying to be a little bit ahead of turning things down so that people have a chance to move to something new, to move to the new tools in Search Console. So as soon as we have more plans on what is happening there, we will let you know. I think this is one of those tools that makes sense to keep. So since we haven't announced that we're turning it off, I imagine it'll just come with the switch to the new Search Console over time.

 


Summary: Most likely, yes. But not just yet.


When merging several subdomains into a large site, should it be done in phases?

35:47

John Mueller February 6 Help Hangout

I would just try to go with the final state as quickly as possible. So instead of creating this temporary situation where things are neither the old one nor the new one, I would try to just redirect to the new ones as quickly as possible.

 

 


Summary: It’s best to do it all in one go.
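When you do redirect everything in one go, it helps to build the full old-to-new URL mapping up front and then verify that every old URL 301s straight to its final destination (no chains, no leftover temporary states). A rough verification sketch using the third-party requests library; all URLs are placeholders:

```python
import requests

# Hypothetical mapping from old subdomain URLs to their final consolidated URLs.
redirect_map = {
    "https://brand-a.example.com/": "https://www.example.com/brand-a/",
    "https://brand-a.example.com/services/": "https://www.example.com/brand-a/",
    "https://brand-b.example.com/": "https://www.example.com/brand-b/",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else None
    ok = (resp.url == expected) and (first_hop == 301)
    print(f"{'OK ' if ok else 'FIX'}  {old_url} -> {resp.url} (first hop: {first_hop})")
```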


Are links important when it comes to ranking videos in regular search?

39:08

John Mueller February 6 Help Hangout

So we use a number of factors when it comes to ranking, and that does include links. It would be wrong to say that we don't use links at all. However, we use, I don't know, over 200 factors for crawling, indexing, and ranking, so focusing on links alone doesn't really make sense. Good content traditionally picks up links on its own, and if you're creating video content on your website, all of that is generally also interlinked within your website, so over time these things kind of settle down on their own. It's not that you need to explicitly build links so that you can show up in a video carousel; I don't think that would make sense.


Summary: Most likely, yes. But there are many other factors as well.


Are navigation links that are hidden on mobile ok?

40:06

John Mueller February 6 Help Hangout

We would really just follow those as normal, so I don't see any problem with that. That said, if your website is hard to navigate on mobile, then users, who for a large majority of sites are mostly coming on mobile, will have a hard time navigating your website and finding your other good content. So I would certainly make sure that any UI that you have available on your website is also available in some form or another for mobile users.

 


Summary: These are generally ok. However, make sure that your navigation is intuitive to users.

Our note: If you have important pages that you are trying to link to internally, we don’t like to rely just on navigation links. We really feel that text anchored links from within the main body of content on your pages are more helpful.

Google employee Zineb Ait once said this:

https://twitter.com/Missiz_Z/status/691687094444539905
(Translation: “The links sitewide in footer or header do not have a very big weight in general.”)


What does it mean when Google rewrites your title tag or meta description?

45:42

John Mueller February 6 Help Hangout

We do have guidelines for how to make good titles and good descriptions, so I'd recommend taking a look at those. Oftentimes when I see Google rewriting the titles or the snippet, it's usually based on situations where we see almost like keyword stuffing happening in the titles or the description. So that might be one thing to watch out for. Another thing to keep in mind is that we do try to pick titles and descriptions based on the query. So if you're doing a site: query and you're seeing your titles and descriptions one way, that doesn't necessarily mean that they'll be shown the same way when normal users search with normal queries. So I'd take a look at both of those, and also definitely make sure to check out the Help Center article.

 


Summary: Sometimes Google will rewrite these if it looks like you are keyword stuffing. They can also be rewritten if Google thinks a different title or description will better answer the user's query.


If you like stuff like this, you'll love my newsletter!

My team and I report every week on the latest Google algorithm updates, news, and SEO tips.

Full video and transcript

Question 1:16 - I have a website that I manage. Recently, last year in June, it changed from a regular web listing to a video carousel for one of the queries, like stock footage or stock videos, and that caused a huge drop in clicks as a result. It was kind of misleading with the video thumbnail, because that result doesn't really match the intent of the user, as users are mistaking the video carousel for a YouTube video. So in order to block it we took some actions: we blocked the video using robots.txt, and we also removed the videos completely from the page. Things actually got worse; instead of picking up the video, Google started showing the image thumbnail in the video carousel with a timestamp of 15 seconds. That ultimately led to a very poor user experience, as the video was nowhere to be found: users saw the video on the search page, and once they landed on the page there was no video, which ultimately led to a slight drop in rankings as well. I know you tweeted out last year that we should try adding a sitemap for the videos and setting an expiration date in the past, and the other option was blocking that part of the page completely, so that's the one we tried. The problem with those thumbnail URLs is they're delivered via a CDN, so we don't really control the URLs and we can't really block them, so any suggestions there would be very helpful.

Answer 3:10 - I think that's kind of tricky, because there is no explicit way to say "I don't want my page to be shown like this in the search results." So what I tend towards is trying the expiration date approach, which I believe you can also specify in structured data on the page. With JSON-LD structured data there's a way to specify an expiration date for a video that we picked up on a page. So that might be something I'd try.

Question 5:23 - Nearly all of the articles in our blog section have been de-indexed. Looking at Search Console, it says "crawled, not indexed" and shows no specific reason why. What could be happening there?

Answer 5:38 - So it's really hard to say without looking at specific examples from your website. What I would recommend doing here is posting in the Webmaster Help Forum with the exact URLs and the queries that you're looking at, so that folks there can take a look. Sometimes it's something simple and technical, such as a blog being set up in a way that noindexes these pages by default, but usually the folks in the webmaster forum are very quick at recognizing these kinds of common issues and can help you to narrow things down, or to escalate them if there's really something completely weird happening here. Another thing to keep in mind is that we don't index all pages that we've seen. It's really common for websites to have lots of pages on them that we know about but don't necessarily crawl and index, and that's not necessarily a problem. That's just our algorithms trying to figure out where it makes sense to focus our energy and where to maybe focus a little bit less energy, and usually that also means those pages are unlikely to be that visible in search anyway, so it wouldn't change that much for your site overall if they were indexed. But again, I would definitely check in with the Webmaster Help Forum to see what some of the possible causes could be.

Question 7:05 - We added a bunch of pages to our site but forgot to update the sitemap file, so Google found them and indexed them anyway. Do we need to do anything special?

Answer 7:16 - No, you don't need to do anything special. A sitemap file helps us add more information with regards to crawling and indexing; it doesn't replace the information that we have for crawling and indexing. So if it's just a matter of finding those URLs, if we've found them through normal crawling, that's perfectly fine. A sitemap file also shouldn't replace normal crawling anyway; if none of these pages were linked internally, we might be able to find them through a sitemap file, but it would be really hard for us to understand the context of those pages. So it's actually a good sign in that regard that we're able to find those pages and index them anyway. A sitemap file would help us here to recognize when you make changes on these pages so we can pick them up a little bit faster, but if we're already indexing them, then at least that first step isn't something that you're critically missing.
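Since a sitemap mainly helps Google notice new and changed URLs faster, it is worth regenerating it with accurate lastmod dates when you add pages. A minimal sketch of generating one in Python; the URLs and dates are placeholders:

```python
# Hypothetical sketch: generating a minimal XML sitemap with <lastmod> dates so
# crawlers can pick up changed pages faster. URLs and dates are placeholders.
pages = [
    ("https://www.example.com/", "2019-02-01"),
    ("https://www.example.com/blog/new-article/", "2019-02-05"),
]

entries = "\n".join(
    f"  <url>\n    <loc>{loc}</loc>\n    <lastmod>{lastmod}</lastmod>\n  </url>"
    for loc, lastmod in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```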

Question 8:17 - I checked our inbound links and found there to be lots of spammy links, which I added to our disavow file. I wonder how important it is for ranking to keep on top of this; I heard Google ignores them anyway?

Answer 8:34 - So in general you don't need to keep on top of this. It's something where pretty much any website, if you look at the inbound links, will have a bunch of links that are just kind of spammy or irrelevant, and that's perfectly fine. We are pretty good at ignoring a lot of the cruft that websites collect over the years. I wouldn't focus on this. If it's something where you know that previously you went out and bought links, or you had an SEO do something really weird with regards to links, then it would make sense to clean that up. But if you're not aware of anything and things are otherwise kind of okay, then I wouldn't worry about this.

Question 9:20 - Does Google pick up on whether we use a location finder on the mobile version to give a better UX? Can this help with rankings?

Answer 9:29 - So I assume with location finder you mean something where your website would recognize where the user is located and maybe show the local phone number or local address based on the user's location. That's not something that we would do with regards to search, so that's generally not something we even pick up on. In general, when Google renders pages it also denies these kinds of additional information requests, so that would be something where we probably wouldn't even notice it.

Question 10:10 - Does link equity flow through a 302 temporary redirect?

Answer 10:21 - A 301 is a permanent redirect, which tells us that a new page is replacing an existing page forever, and a 302 redirect tells us that the content is temporarily available on a different URL. From a practical point of view, what generally tends to happen is that with a 301 redirect we'll focus on the destination page: we'll index that URL and move all the signals there. Whereas with a 302 redirect we'll focus on the initial page and try to keep all of the signals there. So it's not a matter of PageRank passing or not, but rather which of these URLs is actually the one that keeps the signals. It's not that these signals get lost with the redirect; it's more a question of which URL they stay with, and that's kind of the main difference.

So if you're tracking which of these pages is ranking, the redirect target or the initial page that is doing the redirecting, then with a 302 you'll probably tend to see the initial page ranking, because that's the one that we pick, since you're telling us it's just temporary and the content is somewhere else. Whereas with a 301 redirect, we'll probably tend to rank the destination page more. And as always, people get this wrong regularly, and our algorithms try to figure out what it is that people are trying to do. So if we see a 302 redirect in place for a longer period of time, we might decide that the webmaster probably meant this to be a 301 redirect, treat it as such, and shift all of our signals over to the destination page. So that's something where you might see those kinds of changes happening over time as well. It's not that link equity or our signals flow through a redirect or not, but rather which of these URLs we end up picking for our indexing, and that's the URL that ends up getting all these signals.

Question 13:02 - I had a quick question. As you know, we've been dealing for years with all kinds of strange ranking issues. One of my partners, when he was dealing with some YouTube issues, had a potential theory based on disallowed words, for instance. Sometimes we would refer to a car salesman as a moron or something like that in some of our articles. Basically, we wanted to find out whether there could potentially be an issue with using language like that on certain pages of the site. I mean nothing like profanity, but just kind of insult-type stuff like that. We still see the phenomenon where other websites that have basically stolen our content and then slightly modified it rank where we used to, and then we rank kind of badly. When they modify the content after they steal it, they don't tend to steal those pages on the site that have that type of language usage.

Answer 14:49 - I don't think that would affect anything. Some pages have user-generated content on them and use this kind of informal language as well, so I think that's perfectly fine.

Question 17:41 - We added related articles to the bottom of our blog article pages, and we make the rel=next link point to one of these related articles. Is that okay?

Answer 17:52 - No. The rel=next linking is really just for pagination series; it's not something that's meant for related links. So I would just cross-link those related articles normally. Usually that's what these plugins tend to do.

Question 19:42 - Speed and security are becoming more important factors now. Do you plan to use security errors, DNSSEC, and the usage of a CDN as ranking signals, apart from the data for speed?

Answer 19:55 - I don't know if it would make sense to go into the individual elements that make up speed like this, but I could definitely see that... I mean, we do use speed as a ranking factor. So if all of these elements that play into speed make your website faster for users, then probably that's something that would be working for you there. But I don't think it would be a case in which we'd say this specific technology is something that you must use because it's an actual ranking factor. It's more that we say, well, speed is important; how you achieve that speed is ultimately up to you. Maybe you use these technologies, maybe you use other technologies. There are really cool, fancy, new ways that you can make a website really fast, and how you do that is ultimately up to you, because from a user's point of view they don't care what technology you use as long as the page comes up very quickly.

Question 20:59 - We used web.dev and achieved 100 out of 100 for SEO, performance, best practices, and accessibility; however, we didn't see any ranking improvement. How long does it take?

Answer 20:12 - So web.dev is a great way to test your site for a lot of known issues and to compare against known best practices, but just achieving good results there doesn't mean that your website will automatically jump up in rankings above everyone else. That's something to keep in mind: this is not the final and ultimate way of doing SEO and ranking well. It's a list of best practices, and it gives you various things that we can test and flag for you. So I think it's a good idea to look at these things, but you need to be able to interpret what comes out of them, and you need to realize that there's more to ranking number one than just fulfilling a set of technical requirements.

Question 22:26 - Recently I've been researching a way to serve static content with serverless cloud infrastructure, produced by a dynamic CMS (lots of fancy words), so we can keep the great, easy-to-use back end of WordPress that serves the content, and serve static HTML to users, which will dramatically increase performance and security. As Google and Automattic are partnering now, and WordPress is powering a lot of the web, do you think there might be a non-app way of doing this instead of geeking out?

Answer 22:33 - I don't know of any specific way to set that up with what you're looking at there, but in general our testing tools work for any kind of web content. So if you can serve your content using whatever infrastructure you think makes sense for your site, whatever back end you think makes sense for your website, and you can use our testing tools to confirm that Googlebot is able to see that content, then that should work out. It's not that Googlebot would specifically say you need to use this infrastructure and do it like this so that we can handle it; rather, you can use whatever infrastructure you want, and as long as Googlebot can get to that content, you should be all set.

Question 24:32 - You mentioned in previous hangouts that the latest round of performance-related changes applies a gradual rankings penalty as your website gets incrementally slower. We're working on improving our site speed. Do you know how long it might take for Google to notice these improvements once they're out?

Answer 24:54 - So as with pretty much anything related to web search, it's not something where there's a fixed time frame involved, but rather we crawl and index pages over time, and we update these signals that we have for these pages over time. There's no fixed timeline; some of these pages and signals get updated every day or even more frequently, some take a little bit longer, and some take months to get updated. So what you'll probably see here, if you make significant improvements within your website, is this kind of gradual rise over time with regards to us taking those signals into account. It might be very tricky when it comes to speed, in the sense that speed is not the most important ranking factor. We do look at things like content and try to figure out which of these pages are most relevant to users as well. So if a site is really fast, that doesn't mean it's always ranking number one; theoretically an empty page would be the fastest page, but that doesn't mean it's a good result for users, so speed is more of a smaller ranking factor there. You'd probably see bigger changes in your site's visibility in search over time based on improvements in the quality of the website overall, and speed is more like something small that would probably even be hard to measure individually.

Question 27:33 - Does the algorithm possibly associate negative factors with either people or organizations and then downgrade unrelated websites that are associated with the same people?

Basically, we've had the issues with car buying tips, and then one of my partners and I have a hobby-type site related to stone crab fishing. When we put it up, it was ranking in the top three for anything related, and then it kind of disappeared from the rankings. We posted a question on the Webmaster Help Forum, and within minutes the top contributors all started focusing on the fact that the site had associations with car buying tips, so I wanted to see if it's possible that there's some kind of negative connotation somehow attached to us personally.

Answer 28:27 - I can't imagine that there's something attached to you personally where our algorithms would say, "oh man, this guy again." Usually where that comes in is if we see a website that's part of a well-interlinked set of sites that are all problematic; then our algorithm might go, "oh, we've got to be careful here, all of these websites are problematic, so maybe this new website that's also part of this set is also kind of tricky." But if it's a matter of these websites just being on the same server with the same owner, that's usually not a problem. It's also generally more of a problem when it goes in the direction of doorway pages, where maybe you're creating a new website for hundreds of different cities across the country and essentially all these pages are the same, like all of the websites are the same, and that's something where our algorithms might say, well, this doesn't look like a lot of value for us.

Question 31:33 - Basically what happens is these are pages that are the result of a user on our site selecting their biology things, right? So they select a configuration of a gene and a reagent that goes along with it, and what happens is that we create a search page, effectively; it's a filtering process. We create a search page, and then of course it refers to a product page, and they click on that. Those filtered pages end up periodically in Google Webmaster Tools, and there's not just a couple; it happens thousands at a time. So one of the things we were wondering is: we do have it in our robots.txt file, and we believe it's correct; do we need to, call it belt-and-suspenders, also add a noindex command at the beginning of the page or something like that? Would that help? It just makes it difficult to use Webmaster Tools sometimes when it's full of that kind of information. Finding the needle of the things we really have to take care of is made difficult by the sometimes over-enthusiastic way Webmaster Tools captures everything we do.

We see them mostly in the new performance tool, and it will be an anomaly or a soft 404, one of those two, because it tries to go back and find the page and of course it can't find it.

Answer 33:16 - So noindex would be an option here as well, but then you would have to take it out of the robots.txt so that we can see the noindex. I wonder if that's already happening to some extent here, because we wouldn't be flagging it as a soft 404 in Search Console if it were completely blocked by robots.txt. So that's something where maybe we're already able to crawl those pages, and then we say, oh, we probably don't need to index these, therefore we'll let the webmaster know that we stumbled upon them. So that might be something to double-check: that they're actually blocked by robots.txt.

Question - We're pretty sure it also happens and then they disappear, and then it happens and they disappear again, so maybe it's just a matter of the processing time in between the two. They're not really pages, so we don't want them in the index; they have no title, no h1, none of that stuff, because they're really not pages. They're just the results of a user asking to configure a gene and one of our products.

Answer 34:40 - Yeah, so in that case I would just leave them in the robots.txt, leave them blocked. I think that's perfectly fine. There's no real way to block them from appearing at all in Search Console, but I think having them in the robots.txt file is perfectly fine.
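To double-check that those filtered URLs really are blocked (remembering that a robots.txt block also keeps Googlebot from ever seeing a noindex on the page itself), you can test them against your live robots.txt with Python's standard library. Note that Python's parser does not handle every pattern exactly the way Google's does, so treat this as a first check; the URLs are placeholders:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder site
rp.read()

test_urls = [
    "https://www.example.com/search?gene=BRCA1&reagent=antibody",  # filtered page
    "https://www.example.com/products/12345",                      # product page
]

for url in test_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'blocked'}: {url}")
```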

Question 35:00 - By the way when are they gonna put the robots.txt tester tool in the new Webmaster Tools?

Answer 35:06 - I don't know, we haven't announced that yet, but we're trying to be a little bit ahead of turning things down so that people have a chance to move to something new, to move to the new tools in Search Console. So as soon as we have more plans on what is happening there, we will let you know. I think this is one of those tools that makes sense to keep. So since we haven't announced that we're turning it off, I imagine it'll just come with the switch to the new Search Console over time.

Question 35:47 - I have a client going through a site migration. It's a large enterprise website where they're going from multiple subdomains to one, and right now individual business pages and content are across several branded subdomains. Each business has the same template, with one main page and several subpages, and the single-domain experience will consolidate that business template down to maybe one, two, or three pages; there's a lot of content consolidation. They'll be doing this in phases, and it'll still be large-scale, but since we're talking about so many multi-page experiences going down to one potentially, for those to-be-retired or redirected business pages on the old subdomains I've been thinking they should intentionally orphan those first before pushing through so many redirects at once to the new consolidated experience, almost to let the dust settle. Is that the right approach, or is it just best to redirect those old pages to the new, relevant, more compact experience and then let the dust settle from there?

Answer 36:46 - I would just try to go with the final state as quickly as possible. So instead of creating this temporary situation where things are neither the old one nor the new one, I would try to just redirect to the new ones as quickly as possible.

Question 37:49 - If it's dynamic serving and a site owner wants to set up an AMP page, also with the same URL as the desktop page, does he need to set the amphtml tag to the same URL for mobile, desktop, and AMP?

Answer 38:00 - So I think, first of all, you wouldn't be able to use the same URL for mobile and AMP if you're serving different HTML, because the same user would be going to that page and you wouldn't know which content to serve, so that I think wouldn't work. However, you can of course make an AMP page and just say the AMP page is my normal page; that's a perfectly fine setup. For example, in the new WordPress AMP plugin I believe there's an option, I think it's called native AMP, where basically your website is purely an AMP page, and that's a perfectly fine setup. In a case like that you would, I believe, set the amphtml tag to the same URL so that we know this is meant to be the AMP page, and you would also set the canonical to the same URL so that we know this is the canonical that you want to have indexed, and then we'd be able to pick that up.

Question 39:08 - Do backlinks help in the rankings of videos in the Google Search carousel?

Answer 39:13 - So we use a number of factors when it comes to ranking, and that does include links. It would be wrong to say that we don't use links at all. However, we use, I don't know, over 200 factors for crawling, indexing, and ranking, so focusing on links alone doesn't really make sense. Good content traditionally picks up links on its own, and if you're creating video content on your website, all of that is generally also interlinked within your website, so over time these things kind of settle down on their own. It's not that you need to explicitly build links so that you can show up in a video carousel; I don't think that would make sense.

Question 40:06 - How does Google treat sitewide navigational links that are hidden at mobile resolution on responsive pages but visible on desktop?

Answer 40:13 - We would really just follow those as normal, so I don't see any problem with that. That said, if your website is hard to navigate on mobile, then users, who for a large majority of sites are mostly coming on mobile, will have a hard time navigating your website and finding your other good content. So I would certainly make sure that any UI that you have available on your website is also available in some form or another for mobile users.

Question 40:48 - Recently we learned that many of our pages have not been shown in SafeSearch results; these pages are mostly destination pages for cities and countries. I've been keeping track of some of the keywords, for example "gay Barcelona" or gay destinations generally. In the past five days our Barcelona index page showed up in first position in SafeSearch, but it has now disappeared from SafeSearch. We always make sure that there's no explicit image or profanity in the content, but that hasn't guaranteed our position in SafeSearch results. Could you explain how the SafeSearch algorithm actually works?

Answer 41:35 - So we use a number of factors in figuring out when to show which content to which users in the search results. I don't think there's one simple thing that makes SafeSearch work or not. I suspect with a website like yours it'll always be kind of tricky for algorithms to figure out what exactly we should be showing here and how we should be showing it in the search results. I think I passed your website on to the team here so they can take a look at it as well, so I can definitely double-check with them, but I imagine it'll always be kind of tricky and a bit borderline for our SafeSearch algorithms to figure out how we should be handling this kind of website. Which is, I think, always a little bit unfortunate, but it's hard to find exactly the right balance there.

Question 42:37 - I have two websites that offer very similar content; some of it is even duplicated. But only one will verify for Google News while the other one won't. The noticeable difference is one has a health and fitness section while the other one focuses more on lifestyle content and celebrities. Why would one be accepted into Google News and not the other?

Answer 43:43 - I don't know why that might be happening, and I don't know the Google News policies specifically in that regard, so it's really hard to say. In general, though, if these websites are so similar that you're saying some of the content is even duplicated, maybe it makes sense to just focus on one website rather than having two websites that are essentially duplicates, or very similar and targeting the same audience; but that's, I think, more of a general question for you to consider. With regards to Google News specifically, I would recommend going through the Google News publisher forum and double-checking with the folks there. The experts in that forum have a lot of experience with sites that are accepted into Google News, sites that get improved so that they do get accepted, and sites that wouldn't get accepted into Google News, so they can probably give you some tips with regards to what to watch out for specifically for Google News.

Question 44:08 - I have a hotel website, and even if you search for it with its exact name, which is unique, I can't see it on the first page of search results. My Google My Business listing was punished for two weeks; the listing was closed by admins, I sent the documentation, they understood it was a mistake, and we reopened it. I suspect this is the reason why my website can't be found.

Answer 44:38 - So I don't know about the specific case here, so that's really hard to say, but in general, just because a website isn't in Google My Business doesn't mean that we wouldn't show it in web search results. For the most part, the web search results are independent of the Google My Business listings. Obviously, if it is in Google My Business and we show it in that Maps listing, then that would be one place where your website would be visible, but just because it's not in a Maps listing doesn't prevent it from appearing in the normal search listings. So my suspicion is that there's probably something else that you could be focusing on or looking at there, and as in some of the other cases, I'd recommend going to the Webmaster Help Forum and getting some input from other people who've seen a lot of these cases and might be able to help you figure out what you could be doing to improve.

Question 45:42 - I noticed Google rewrites some titles and meta descriptions. Any idea how to know if Google will rewrite the content or keep the original version?

Answer 45:53 - We do have guidelines for how to make good titles and good descriptions, so I'd recommend taking a look at those. Oftentimes when I see Google rewriting the titles or the snippet, it's usually based on situations where we see almost like keyword stuffing happening in the titles or the description. So that might be one thing to watch out for. Another thing to keep in mind is that we do try to pick titles and descriptions based on the query. So if you're doing a site: query and you're seeing your titles and descriptions one way, that doesn't necessarily mean that they'll be shown the same way when normal users search with normal queries. So I'd take a look at both of those, and also definitely make sure to check out the Help Center article.

Question 46:46 - How does Googlebot view website personalization? We have a new product that layers the website content to allow personalization based on industry, location, or even down to a single company. This allows us to serve really bespoke content to individual end users. My concern is that I'm showing the original on-page content to Googlebot and personalized content to end users; will this affect our clients negatively?

Answer 47:14 - Maybe. Maybe it will. So the thing to keep in mind is that Googlebot indexes the content that Googlebot sees. If you have something unique that you're showing individual users and Googlebot never sees it, then we wouldn't be able to index it, and we wouldn't be able to show that website in search for those queries. So for example, if you're recognizing that a user is from the US and you show them English content, you show a user from France French content, and Google crawls from the US, then Googlebot will only see the English content and will never know that there's actually French content on this website as well, because it would never be able to see that content. So that's something to keep in mind here. If you're just doing subtle personalization, like maybe adding related products or adding additional information to the primary content on your page based on location or other attributes, then we'd be able to rank the page based on the primary content that we can see, but we still wouldn't know what this additional layer of information is that you're adding to those pages. So it's not a matter of us penalizing a website for doing this or causing it any problems; it's a more practical thing: we can't see it, so we don't know how we should rank it.
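A quick way to see what Googlebot is likely missing is to fetch the same URL with and without a Googlebot user-agent string and compare the responses. This is only a rough check (Google's own rendering also involves JavaScript execution and US-based IP addresses, which this sketch cannot reproduce); it uses the third-party requests library, and the URL is a placeholder:

```python
import requests

URL = "https://www.example.com/personalized-page"  # placeholder URL
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

default_html = requests.get(URL, timeout=10).text
googlebot_html = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10).text

# If the personalized layer only appears for regular users, the two responses differ,
# and whatever is missing from the Googlebot version cannot be indexed or ranked.
print("identical responses" if default_html == googlebot_html
      else f"responses differ by {abs(len(default_html) - len(googlebot_html))} characters")
```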

Question 49:25 - Two small questions. My hotel website has a .com.tr domain name, and therefore Google automatically sets its region to Turkey in Google Webmaster Tools. I can't change it, but my customers are mostly people from the UK, for example, and I want to be listed when searched from the UK. Do you think changing the domain name is needed if I want to be listed for people from Europe?

Answer 49:50 - You could do that. So when a user from a certain country is searching for something local, we'll use geo-targeting to try to highlight those pages for those users. But if a user in the UK is searching for a hotel in Turkey, then there's no need to do geo-targeting, because a website geo-targeted to the UK would not be more relevant for a user searching for something explicitly in Turkey. So I think for the most part you wouldn't need to use geo-targeting there; a website where we recognize Turkey as the country can still be relevant globally. So unless you're, let's say, a pizzeria that offers special Turkish pizza and delivers it to users in London, where a user in London searching for pizza in London would be a problem for your Turkish website, a user in London saying "I want to search for hotels in Turkey" will find your website perfectly fine.

Question 52:01 - A general question for a lot of publishers: you'd be surprised how many people create a ton of useless tags, and this was something that we realized a few years ago; somehow we had thousands of tags. We worked really hard and got down to a few hundred tags, and we were very happy with how we did that. Now I see, even after that, that a few of the things we bucketed and set up 301 redirects for cover topics we no longer write about, for instance celebrity crime or celebrity divorces; we just do the fact-checking now. What happens when you've got all these redirects to one sort of tag which is no longer that applicable? Does it just die out, or would you redirect it some other way? Because I can't really think of a way to redirect, let's say, celebrity divorce or health to something else, and yet we have, let's say, a hundred stories on that tag at this point.

Answer 53:11 - I think that's something that can naturally evolve over time. So if you start having more content there and you want to revive that tag, that's perfectly fine. If you want to combine things even more, that's perfectly fine as well. These kinds of almost-category pages evolve over time; I think that's normal.

Question - But what happens if that category is just no longer useful anymore? Does it just sit there forever? I mean, it's fine, but I'd obviously like to consolidate as much as I can.

Answer 53:49 - I think consolidating is fine. Another option is you might say maybe even noindex makes sense: we don't really want to be indexed for this content anymore, we want to keep it on our site for people who know that it's there and explicitly search for it within your website, but maybe noindex would help Google or other search engines focus more on the indexable content.