February 22, 2019 – Google Help Hangout Notes

John Mueller twice in one week!? In this hangout, John had some great insights on redirects, AMP, algorithms, and information for e-commerce sites. You can find the full video and transcript below. We also collect and curate the best SEO articles of the past week and sum them up in quick and easy summaries in our incredible weekly newsletter!

How do you decide when to use a 301 redirect vs a 302?

4:45

Some e-commerce sites tend to redirect their popular search term pages to more curated landing pages, for example redirecting a search query page to /video-games/xbox. Is that supposed to be a 301 or a 302? What if they want to redirect the search term to another page during different periods?


Summary: If the redirect is likely to be permanent, it should be a 301; if it is temporary or may be reverted later, a 302.
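
The distinction can be sketched with a minimal WSGI handler. This is a hedged illustration only; the URL mappings below are hypothetical, not from the hangout.

```python
# Hypothetical mappings for illustration; adjust to your own site.
PERMANENT = {"/search/xbox": "/video-games/xbox"}  # curated page replaces the search URL for good
TEMPORARY = {"/deals": "/deals/spring-sale"}       # seasonal redirect that will revert later

def redirect_app(environ, start_response):
    """Minimal WSGI app showing the 301-vs-302 choice described above."""
    path = environ.get("PATH_INFO", "/")
    if path in PERMANENT:
        # 301: permanent move, so Google should prefer indexing the destination URL
        start_response("301 Moved Permanently", [("Location", PERMANENT[path])])
    elif path in TEMPORARY:
        # 302: temporary move, so Google should keep preferring the originating URL
        start_response("302 Found", [("Location", TEMPORARY[path])])
    else:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    return [b""]
```

Note that, as John says below, a 302 left in place for a long time may eventually be treated as permanent anyway.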


Interesting thoughts on the use of text on product pages

7:50


This is something that comes up fairly regularly. One of the reasons why websites initially started kind of doing this kind of workaround is that it was really hard sometimes for us to rank category pages on e-commerce sites if there is no useful information on the page or no context on the page. As a workaround, people started stuffing whole Wikipedia articles below the fold using a small font, sometimes using a link that says “more information” and pops up with a giant article of text. From our point of view, that is essentially keyword stuffing. That is something I would try to avoid. I’d try to stick to really informative content and put that in a place where you think users would be able to see it, especially if it is content that you would like to provide for users.


More than that, I would think about what you can do to make those pages rank well without having to put a giant paragraph of content below the page. Things you could do here — kind of make sure that those pages are well integrated with your website so that we have clear context of how those pages belong in the website and what those pages are about. Another thing you can do, when you have that listing of products, is make sure there is some information on those listings so that we can understand what it is about. Instead of just listing 40 photos of your product, put some text on it. Make sure you have alt text for the images and that you have captions below the images so that when we look at this page, we understand, “There is this big heading at the top that is telling us this is the type of product you have on the website. There is lots of product information in those listings, and we can follow those listings to more information.” You don’t need to put this giant block of text on the bottom.

Having some amount of text makes sense, so maybe shifting that giant block of text into one or two sentences that you place above the fold below the heading is a good approach here because it also gives users a little bit more information about what they should expect on this page. That is kind of the direction I would head there. I would really try to avoid the situation where you are kind of fudging a page by putting tons of text on the bottom of the page just because the rest of the page is sub-optimal. Instead, try to find ways to improve the page overall so that you don’t have to do this workaround.

Summary: If there is no helpful text on a product page, Google may have trouble understanding it and ranking it well. Adding text that no one will read, that is just there for SEO benefit, is not likely to help.

Make sure important product pages are linked to appropriately from within your site. Add text that users will truly find helpful. Having at least some of that text above the fold is a good idea.


Do schema errors impact rankings?

16:22

We look at this on kind of a feature level, where we say in order to present your site in this particular way, we need to have this kind of markup on the page and it needs to follow these guidelines and it needs to follow these requirements. For example, if you want to be visible as a recipe rich card in the search results, there are certain requirements that you need to meet, and we’ll look to see if those requirements are met. If the markup for those requirements is valid, that’s good. If there is other markup on the page that is invalid, then that is not a problem for us. It is really on a feature level. We want to show your site with a rich result for recipes, we see you have all the requirements, we’ll take all that and show that in the search results. We will present the site in a way that really encourages users to go and check those details out. If you have other markup on those pages that maybe matches one of the other features that is out there but is not valid markup yet, we just ignore that. From that point of view, it is not that you need to have zero errors for your markup. Rather, you need to think about which feature you’d like to kind of make use of in the search results and double check that the requirements for that feature are met.

Summary: Errors in your markup can prevent rich features (such as review stars) from showing in the search results, but they should not affect organic rankings.
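
As a sketch of what feature-level markup looks like, here is a minimal, hypothetical Recipe JSON-LD block built in Python. The property choices here are illustrative only; check Google's rich-results documentation for the current required properties of each feature.

```python
import json

# Illustrative Recipe structured data; field names follow schema.org conventions,
# but the exact requirements for the rich result should be verified in Google's docs.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Banana Bread",
    "image": ["https://example.com/banana-bread.jpg"],
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# Embed it in the page as a JSON-LD script block.
json_ld = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(recipe, indent=2)
)
```

Per John's answer, invalid markup elsewhere on the page would simply be ignored; only the markup backing the feature you actually want must validate.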

Does Google ever push out algorithm updates in one specific industry?

18:15

This is an interesting question in regards to generally how we work on our search results. From our point of view, it is usually not the case that we would say, “We need to do something specific to improve the search results for one particular industry,” but rather, we look at it the other way around and try to think about ways we can improve search results for specific types of queries. It’s not so much that we focus on the industry, but we focus on the searches that people make. Obviously, they’re kind of related. If we see, for example, that people are getting confusing information from medical queries, then maybe we need to improve how we recognize the relevance of search results for medical queries. It’s not so much that we would target the medical industry and say, “We need to improve the way that these particular 10 sites are shown in the search results,” but more that we see users are confused with this type of query and it is something that is confusing a lot of people and we need to find a way to improve the relevance and the quality for those particular queries.

Summary: Not usually. Google focuses on improving results for specific types of queries rather than targeting a particular industry, so a change prompted by problems in one vertical will likely affect queries and websites in other verticals as well.

Can quality issues be passed via redirects during a site move?

20:04


So it sounds like you have everything set up, but maybe something went wrong along the way. What I would do is post in the Webmaster Help Forum to get someone to really look at this specific situation and see if there was something that you overlooked. There are some situations where moving from one domain to another does lead to issues, specifically if the domain you’re moving to has a weird old history associated with it. That might take a bit of time to clear out, for us to recognize that this website is not related to the old one and that we should treat this as a new situation and not take the old situation into account.

Summary: If you are redirecting from a site that had quality issues, it can take some time for Google to assess this move.

Our note: John has said in the past that if you redirect links from a site with link quality issues to a new site, you’ll pass on those bad link signals. Be careful!

What does it mean if you can’t rank in the Top Stories carousel?

23:37


Usually not. If you implement AMP (I think that’s required for mobile, and I think for desktop you don’t need AMP for the Top Stories carousel, though I’m not 100% sure on that), then you’re covered anyway. Otherwise, the Top Stories feature is an organic search feature. It’s not that you need to do anything specific to be visible; rather, we try to pick that up organically and show it when we think that makes sense.

Summary: You must be using AMP to be featured in the Top Stories carousel on mobile. Otherwise, John had no specific answer for this question.

How can I see Google SERPS as if I was searching from another country?

27:55


The thing that I usually do to check is, on the one hand, go to that local Google version, and on the other hand, there’s an advanced search setting that allows you to see the results for a specific country. You can get there fairly quickly by adding or changing the URL parameter “&gl=” followed by the country code. You can also use “&hl=” with a specific language code if you want the search results in a different language. Both of those you can also set in the advanced search settings. This won’t help with local search results, however. So if you’re looking for a pizzeria in one city, that’s not something we have as a parameter.

Summary: Add “&gl=” followed by the country code to the search results URL.

Our note: We have more info here on seeing search results from another geolocation.
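
These URLs are easy to build programmatically. A minimal sketch, assuming the standard google.com search endpoint; the helper function name is ours, not from the hangout:

```python
from urllib.parse import urlencode

def google_serp_url(query, country=None, language=None):
    """Build a Google search URL with the gl/hl overrides mentioned above.

    gl sets the country the results are tailored to; hl sets the language.
    """
    params = {"q": query}
    if country:
        params["gl"] = country.lower()   # e.g. "in" for India
    if language:
        params["hl"] = language.lower()  # e.g. "de" for German
    return "https://www.google.com/search?" + urlencode(params)
```

For example, `google_serp_url("hotels in new york", country="IN")` approximates what a searcher in India would see, though as John notes, truly local results (like a pizzeria in one city) can't be simulated with a parameter.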

Should you have separate URLs for each variation of a product (i.e. different sizes, colors, etc.)?

30:00


That’s a really common question that we get a lot for e-commerce sites. Unfortunately the answer is, it depends. By default I would prefer to have fewer pages. The advantage is it’s less to crawl, so it’s easier to update, and on the other hand, fewer pages means we can concentrate the value on fewer URLs; we don’t have to dilute it across different versions… Chances are that one product page will be more relevant in the search results in general because we are able to concentrate those signals, all of that value, into that product page. The thing that I would call out as an exception here is if people are explicitly looking for something different, so one of these variations is very different from the other one and it doesn’t make sense to combine them.

Summary: It’s often a tough call, but usually it’s best to have just one page in Google’s index.

Our note: This is a good place to use a canonical tag to consolidate all of these versions. The exception would be a situation where there is enough search volume for one of the specific types of this product to warrant it having its own page.
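
A minimal sketch of what that consolidation looks like, using hypothetical URLs: each variant page would emit the same canonical tag in its `<head>`, pointing at the main product URL.

```python
# Hypothetical variant URLs of one product page; the canonical tag tells
# Google which URL to consolidate indexing signals onto.
VARIANTS = [
    "https://example.com/chairs/model-x?color=red",
    "https://example.com/chairs/model-x?color=blue",
]
CANONICAL = "https://example.com/chairs/model-x"

def canonical_tag(url):
    """Return the <link> element each variant page would include in its <head>."""
    return '<link rel="canonical" href="{}">'.format(url)
```

Every page in VARIANTS would include `canonical_tag(CANONICAL)`, so signals from all color versions concentrate on the one canonical product page.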

Is it better to produce one massive piece of content, or should it be divided into several different pages?

38:22


Unfortunately, the answer here is also, it depends, in that sometimes people are looking for one big comprehensive piece of content and sometimes people are looking for individual pieces of content. So I don’t know if it would make sense to always go the combined route or always go the split route. What I’ve noticed from working with our tech writers is that sometimes content performs in ways that you don’t expect, and it’s worth testing to see how well it works for users. Kind of trying to figure out, are people actually going through that content and getting something useful out of it? Are they converting in a way that’s useful for you? And based on that, then making a decision. For example, if you split an article up and users start on page 5 instead of page 1 from the search results, is that still as useful to you and to users as if they landed on a big article where they had all of the comprehensive information? I don’t know, maybe there are ways that you can kind of make both of those work. So I’d really recommend testing this and not kind of blindly saying at 8,000 words I need to split it up into two chunks. Instead, try to figure out what makes sense for your particular piece of content and what makes sense for your ultimate goal. You’re putting this content out there because you want to achieve something specific, so measure what that effect is based on those different variations that you’re thinking about.

Summary: It depends. We should be testing content like this to see what users prefer and what drives more conversions.

Our note: If you are splitting up content into several pages, be sure to only do this if it offers value to users. Don’t do it just to get more pageviews with more ad views.


If AdSense classifies pages as adult, does this mean that they are filtered in organic search as well?

42:30


So as far as I know, AdSense does a lot of these classifications completely differently from Search; they use their own systems for this. Part of that kind of makes sense because they have different policies. When it comes to Search, we might choose to show things in one way, but AdSense, because they’re focused more on the advertising part of the setup, might have more restrictive policies where they say this type of content is not something they’d like to place ads on, or they just might have different policies than we would have in Search overall. So just because you’re seeing something happening on the ads side doesn’t necessarily mean that the same thing would apply on the search side.

Summary: No. AdSense has completely different policies than organic search when it comes to classifying content as potentially adult.

Can low quality content on one part of a site, such as a blog section, negatively affect the rankings of the entire site?

42:30


In general, we do try to look at a website overall, and if there are significant parts of the website that are really bad, then that can have an effect on the rest of the website’s rankings as well. Usually, when it comes to a situation where you have a blog and an e-commerce site, the e-commerce site is what everyone is focusing on and the blog just provides a little bit of extra information, and if the blog is kind of bad, then that doesn’t really affect the bigger chunk of the e-commerce site. The one kind of situation where it can play a role, and I think this question kind of goes in that direction, is if the blog is set up in a technically bad way such that Googlebot has a lot of trouble crawling it. Maybe accessing the URLs is really, really slow, or it returns a lot of server errors, which is kind of mentioned here. What will happen there is we will reduce the crawling of that website in general. So it’s not necessarily the case that we would drop it in rankings. Bad pages on a website, from a technical point of view, if they just don’t work, we try to drop those pages. But if from a crawling point of view we have trouble crawling a significant part of that website, then we will slow down crawling because we want to make sure that our crawling is not the reason why this website is performing badly.

So, for example, if we crawl the blog and we see a lot of server errors, we might say, well, maybe we’re crawling too hard; we don’t want to cause any problems, so we will reduce our crawling speed. And if the e-commerce site is on the same setup as the blog, then we would also reduce our crawling for the e-commerce site. We track the amount of crawling that we do on a host level, so if these two parts are on the same host, then we would probably try to crawl them at the same rate, and if one of them is really bad such that we can’t crawl it without a lot of server errors, then we’ll reduce the crawling overall.

And for a lot of websites that doesn’t really matter; we can still keep up with most changes on the website even if we don’t crawl as frequently. For a large e-commerce site, though, that can play a role. In particular, if you have products that come and go and we can’t keep up with crawling those products as they come and go, then the search results for the e-commerce site will end up getting a bit stale, and that’s something that maybe users will see in the search results.

So if they’re searching for a new, I don’t know, a new phone, and your e-commerce site has that new phone but we haven’t been able to crawl those pages yet, then we wouldn’t be able to show your site in search results for that new phone. So that’s something where I would take a look and see what you can do to improve it. From a quality point of view, again, it’s less of an issue if a part of the website is not really that great, though obviously it can affect things as well. But especially from a technical point of view, if we have significant issues with crawling a part of the website, then we will crawl less. It’s not a penalty. It’s not something where we’re saying, oh, this website is bad, we will not spend so much time there; it’s more a matter of our algorithms trying to be good citizens of the web and saying, well, we want to be sure that we’re not the reason why this website is having so much trouble.

Summary: Yes, having low quality content on one section of a site can impact the ability of the entire site to rank well. For example, if there are technical errors in the blog that make it challenging for Google to crawl that section, it may slow down Google’s crawling for the entire site, which can impact your ability to rank new pages.

Should you have a sitemap for AMP URLs?

42:30


You don’t need to do that. Again, the exception is if your whole website is AMP-only; then obviously those are your pages.


Summary: No, unless your whole site is AMP.


If you like stuff like this, you’ll love my newsletter!

My team and I report every week on the latest Google algorithm updates, news, and SEO tips.

Full Video and Transcript


Question 0:36 – On our website, the product team runs lots of calls to action. For one, they don’t even have an option to close that specific thing. When the user is going through the page, they’re shown a similar call to action 3-4 times. Is there any direct effect on quality if the user comes in and is being blocked from what they want to do? From Google’s side, how do they see these kinds of things?

Answer 1:30 – Specifically interstitials and pop ups that block the content, that is something we would pick up with the mobile friendly classifier that we have. We would probably treat those pages as not being mobile friendly and would not show them as high in the search results in general. That is something that you might see there. But I’d imagine that the bigger effect is more the long-term effect, where if people are coming to your website and you are essentially blocking them from seeing what they were looking for, why would they continue staying on your website? Why would they come back? That is kind of what I would look at there. A lot of times people trade the short-term wins against the long-term wins, where maybe in the short term people are going to click on that link on your interstitial a little bit, but in the long run, they are going to remember that this website is terrible and they are going to avoid it. That is kind of the way I would look at it there.

Question 2:40 – They are saying that they are showing this specific pop up after 10 seconds or so. Is it the same thing or is it okay?

Answer 2:53 – That’s the same thing. Who can look at the content on a page in 10 seconds? Most pages don’t even load in that time.

Question 3:30 – If I open Google News and type a specific query, many articles from my website are well ranked, but if I repeat the same query and this time click on the Google News auto-suggest topic, none of my articles appear. It looks like Google can’t rank my pages for the auto-suggested topic, just for the query. What could be happening here?

Answer 3:53 – I don’t know. I really don’t know where these Google News topics come from. In general, the ranking within Google News isn’t really the same setup as the normal search results, so it is really hard for me to say there. What I would recommend doing there is going to the Google News help forum and maybe posting your example there, maybe with some screenshots so that people can see exactly what you were seeing.

Answer 4:45 – That is an interesting question. We are looking into what we can write up for e-commerce sites in general, so maybe we can include something like this, too. In general, if you have one page that is replacing another one, a redirect is a fine thing to do here. If it is something that you think is going to permanently replace a page, a 301 redirect would be the right one. If you think this is something that will change over time, or the redirect will maybe revert and won’t be redirected in the future, then a 302 would be the right approach.

From a practical point of view, there are two things that kind of play in here when it comes to Google. On the one hand, we try to differentiate between, “Should we index the content using the originating URL (in this case, that search query)?,” or, “Should we be indexing the content with the destination URL, which might be /video-games/xbox.” The 301 and 302 help us to make that decision. The 301 tells us you should prefer the destination page. A 302 tells us you should prefer the originating URL. That is something that kind of plays in there.

The difficulty that we find in practice is that the web is really messy. People do things in really weird ways across the web and we still have to try to figure out what it is that they actually meant here. For example, if we see a 302 redirect being in place for a longer period of time, then we probably are going to assume that this is not a temporary thing but something that is more of a permanent thing and we’re going to start treating that like it is a permanent change. That is one thing to keep in mind there.

The useful part here is that it doesn’t actually matter what URL is picked for indexing because we rank the page in exactly the same way. From that point of view, I would really focus more on which of these redirects is the right one to do for this situation and not worry about the SEO aspect because from an SEO point of view, it is really more, “Which of these URLs do we show in search?” and not, “Which of these gets page rank or how do they rank differently?” That’s all the same. We just show a different URL in search. From a ranking point of view, they would be equivalent.

Question 7:22 – Many e-commerce websites optimize their categories by adding a big chunk of text below the product listings and nothing except an h1 heading above the fold. I don’t consider this good usability considering users have to scroll all the way to the end to read this. Does Google treat this content the same as any other or would you, for improving rankings, consider putting the category text above the fold?

Answer 7:50 – This is something that comes up fairly regularly. One of the reasons why websites initially started kind of doing this kind of workaround is that it was really hard sometimes for us to rank category pages on e-commerce sites if there is no useful information on the page or no context on the page. As a workaround, people started stuffing whole Wikipedia articles below the fold using a small font, sometimes using a link that says “more information” and pops up with a giant article of text. From our point of view, that is essentially keyword stuffing. That is something I would try to avoid. I’d try to stick to really informative content and put that in a place where you think users would be able to see it, especially if it is content that you would like to provide for users.

More than that, I would think about what you can do to make those pages rank well without having to put a giant paragraph of content below the page. Things you could do here — kind of make sure that those pages are well integrated with your website so that we have clear context of how those pages belong in the website and what those pages are about. Another thing you can do, when you have that listing of products, is make sure there is some information on those listings so that we can understand what it is about. Instead of just listing 40 photos of your product, put some text on it. Make sure you have alt text for the images and that you have captions below the images so that when we look at this page, we understand, “There is this big heading at the top that is telling us this is the type of product you have on the website. There is lots of product information in those listings, and we can follow those listings to more information.” You don’t need to put this giant block of text on the bottom.

Having some amount of text makes sense, so maybe shifting that giant block of text into one or two sentences that you place above the fold below the heading is a good approach here because it also gives users a little bit more information about what they should expect on this page. That is kind of the direction I would head there. I would really try to avoid the situation where you are kind of fudging a page by putting tons of text on the bottom of the page just because the rest of the page is sub-optimal. Instead, try to find ways to improve the page overall so that you don’t have to do this workaround.

Question 16:00 – If schema on a site has errors, how much impact does it have on indexing in the search results? New schema types approved by Google started throwing errors on the site which did not have errors before. What is going to be the impact?

Answer 16:22 – We look at this on kind of a feature level, where we say in order to present your site in this particular way, we need to have this kind of markup on the page and it needs to follow these guidelines and it needs to follow these requirements. For example, if you want to be visible as a recipe rich card in the search results, there are certain requirements that you need to meet, and we’ll look to see if those requirements are met. If the markup for those requirements is valid, that’s good. If there is other markup on the page that is invalid, then that is not a problem for us. It is really on a feature level. We want to show your site with a rich result for recipes, we see you have all the requirements, we’ll take all that and show that in the search results. We will present the site in a way that really encourages users to go and check those details out. If you have other markup on those pages that maybe matches one of the other features that is out there but is not valid markup yet, we just ignore that. From that point of view, it is not that you need to have zero errors for your markup. Rather, you need to think about which feature you’d like to kind of make use of in the search results and double check that the requirements for that feature are met.

Question 18:00 – When the team pushes out algorithm changes, are there times where the changes are aimed at just a specific industry, or are the changes made and they just happen to affect one industry more than the others?

Answer 18:15 – This is an interesting question in regards to generally how we work on our search results. From our point of view, it is usually not the case that we would say, “We need to do something specific to improve the search results for one particular industry,” but rather, we look at it the other way around and try to think about ways we can improve search results for specific types of queries. It’s not so much that we focus on the industry, but we focus on the searches that people make. Obviously, they’re kind of related. If we see, for example, that people are getting confusing information from medical queries, then maybe we need to improve how we recognize the relevance of search results for medical queries. It’s not so much that we would target the medical industry and say, “We need to improve the way that these particular 10 sites are shown in the search results,” but more that we see users are confused with this type of query and it is something that is confusing a lot of people and we need to find a way to improve the relevance and the quality for those particular queries.

Question 19:35 – I migrated a website from domain A to domain B. All 301s have been in place from every old URL to the respective new version since day one, a change of address request was submitted, domain A has its old sitemap updated, and all URLs redirect to domain B. Domain B has only a new sitemap file updated, and all the current rankings are lost.

Answer 20:04 – So it sounds like you have everything set up, but maybe something went wrong along the way. What I would do is post in the Webmaster Help Forum to get someone to really look at this specific situation and see if there was something that you overlooked. There are some situations where moving from one domain to another does lead to issues, specifically if the domain you’re moving to has a weird old history associated with it. That might take a bit of time to clear out, for us to recognize that this website is not related to the old one and that we should treat this as a new situation and not take the old situation into account.

Question 21:30 – Since Google+ is shutting down, is it possible that Google can provide a way for publishers to port their followers and users to YouTube?

Answer 21:40 – I don’t think so. There are ways to export your data from Google+ by using the Takeout feature… but I’m not aware of anything that would take someone who’s following you there and make them follow you on YouTube. I think that would be kind of stretching…

Question 22:22 – A question about ranking in the Top Stories carousel. The past few months we’ve noticed strange behaviour on our international news website. For the same query we rank very well in the News tab SERPs, but we don’t appear in the Top Stories carousel, while on the other hand we rank very well in the video carousel. So my question is, is there particular structured data we can implement in order to appear in the Top Stories carousel? We correctly implemented the AMP pages and structured data, so we’re wondering if there is a particular implementation we have to do to appear in the Top Stories carousel.

Answer 23:37 – Usually not. If you implement AMP (I think that’s required for mobile, and I think for desktop you don’t need AMP for the Top Stories carousel, though I’m not 100% sure on that), then you’re covered anyway. Otherwise, the Top Stories feature is an organic search feature. It’s not that you need to do anything specific to be visible; rather, we try to pick that up organically and show it when we think that makes sense.

Question 27:35 – How can I make sure that a geo TLD is geo-targeted correctly? If I type “hotels in New York” in India, it shows Indian websites. So the question is, how can I double-check what geo-targeted requests look like?

Answer 27:55 – The thing that I usually do to check is, on the one hand, go to that local Google version, and on the other hand, there’s an advanced search setting that allows you to see the results for a specific country. You can get there fairly quickly by adding or changing the URL parameter “&gl=” followed by the country code. You can also use “&hl=” with a specific language code if you want the search results in a different language. Both of those you can also set in the advanced search settings. This won’t help with local search results, however. So if you’re looking for a pizzeria in one city, that’s not something we have as a parameter.

Question 29:30 – We have two chairs, one in leather, one in fabric, both with separate URLs and different model numbers. Is this a problem, or would they be filtered out as duplicate content?

Answer 30:00 – That’s a really common question that we get a lot for e-commerce sites. Unfortunately, the answer is: it depends. By default I would prefer to have fewer pages. The advantages are that there’s less to crawl, so it’s easier to update, and fewer pages means we can concentrate the value on fewer URLs; we don’t have to dilute it across different versions. Chances are that one product page will be more relevant in the search results in general because we are able to concentrate those signals, all of that value, into that product page. The exception I would call out here is if people are explicitly looking for something different, where one of these variations is very different from the other and it doesn’t make sense to combine them.

Question 32:00 – If you have an e-commerce website and you get a lot of links to product pages which, due to their nature, expire, what can you do so that the link equity to those pages won’t be lost? Would you create a redirect rule that automatically redirects the page to a subcategory?

Answer 32:55 – In general, people see this as more of a problem than it actually is. If the content is so temporary that it expires regularly, then usually that’s not something people will link to, and maybe that’s something you can encourage people to link to in different ways. For example, if you know that this product is only going to last a couple of months and will never be available again, maybe it makes more sense to encourage users to link to the category of products instead, or to your business, rather than to this one specific product, because in the long run links to a product that no longer exists may not make sense for other people either. So what I’d recommend doing is, on the one hand, if you have products that change over time, maybe the new product is a replacement for the old one and you can redirect from the old product to the new version. On the other hand, if this is an informational landing page that is useful regardless of whether or not you sell that product, some of the information on that page might still be relevant to users. What I would expect to see from a search point of view is a sort of soft 404 page: you’re saying this product doesn’t exist anymore, but you’re still showing some content instead… There are lots of subtle edge cases here, so I’m hesitant to say that everyone should do it one way or another. There are some really neat write-ups on how to handle expired content out there, so I’d look around to see what options are available and which make sense in your specific case. Maybe there are mixes you can do as well, where in the first month or so when the product is not available you do one thing, and after a year you do something completely different, or just return a 404 because it’s really gone and nobody should care about it anymore.
But in general, I wouldn’t care too much about those links, especially if they’re to products that are really temporary by nature, because if you’re building your whole user experience around how Google uses those specific links, then chances are you’re taking a bigger hit from having a bad UX than you would ever gain by tricking Google into thinking that links which went to an expired product are now relevant to a different thing.
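The mix-and-match approach John sketches (redirect when a real replacement exists, soft content for a while, then a plain 404/410) can be expressed as a small decision helper. This is only an illustration of one possible policy; the function name, URLs, and the one-year grace period are hypothetical choices, not anything Google prescribes:

```python
from datetime import date, timedelta

def expired_product_response(expired_on, today, replacement_url=None,
                             grace_period=timedelta(days=365)):
    """Decide what an expired product URL should return.

    - 301 to a direct replacement product, if one exists.
    - Within a grace period, a 404 page that still shows helpful content
      ("this product is gone, here are similar ones").
    - After that, a bare 410: the product is really gone.
    Returns an (http_status, location_or_template) tuple.
    """
    if replacement_url:
        return (301, replacement_url)
    if today - expired_on <= grace_period:
        return (404, "/templates/expired-product-with-suggestions")
    return (410, None)
```

A web framework’s route handler could call this to pick the status code and the page to render.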

Question 38:00 – Let’s say I have a big 10,000-word piece of content, for example a guide divided into 10 chapters. From an SEO perspective, is it better to publish it as a single page or to split each chapter onto its own dedicated page?

Answer 38:22 – Unfortunately, the answer here is also: it depends, in that sometimes people are looking for one big comprehensive piece of content and sometimes people are looking for individual pieces of content. So I don’t know if it would make sense to always go the combined route or always go the split route. What I’ve noticed from working with our tech writers is that sometimes content performs in ways that you don’t expect, and it’s worth testing to see how well it works for users. Try to figure out: are people actually going through that content and getting something useful out of it, are they converting in a way that’s useful for you? Based on that, make a decision. For example, if you split an article up and people start on page 5 instead of page 1 from the search results, is that still as useful to you and to users as if they had landed on one big article with all of the comprehensive information? Maybe there are ways that you can make both of those work. So I’d really recommend testing this and not blindly saying, at 8,000 words I need to split it up into two chunks. Instead, try to figure out what makes sense for your particular piece of content and what makes sense for your ultimate goal. You’re putting this content out there because you want to achieve something specific, so measure what that effect is based on the different variations you’re thinking about.

Question 40:00 – I noticed a few big publications listed in Google News are backdating some of their news articles to fool Google search users into believing that they’re the first source of that news. Can Google detect this? How does Google act on this? Where can you report such findings?

Answer 40:18 – We use multiple methods to figure out what the right date is for a page. Sometimes people put a date on the page and we say, well, this is not correct, and we’ll treat it as something else. So I don’t know if I would assume that just by backdating something you would get any kind of preferential visibility in the search results; I’d question that part of the question. One of the things I have noticed, though, is that a lot of publications have trouble specifying dates in a way that is reasonable for Google and hard to misunderstand. Sometimes it’s something as simple as the date format, where we can’t recognize that this is actually a date. Sometimes it’s something trickier: sometimes there’s a time zone specified, sometimes there isn’t; sometimes they’re using structured data to specify a date and a time, and then on the page that information is not available. All of these things can make it really hard for us to pick the right date. Sometimes it’s not so much a matter of a publisher trying to mislead Google, but rather Google being confused by what a publisher is providing, and those situations are always interesting for us. So you’re welcome to pass those on to me so that we can take a look to see what is actually happening. How did we get confused? What could we do so that webmasters and publishers understand better how to provide dates that work well for Google?
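The ambiguities John lists (no time zone, unrecognizable format, structured data disagreeing with the page) largely disappear if the structured-data date is emitted as ISO 8601 with an explicit offset. A small sketch of that, assuming schema.org NewsArticle markup; the headline and the UTC+1 offset are placeholder values:

```python
import json
from datetime import datetime, timezone, timedelta

# Publish time with an explicit timezone offset (here UTC+1), so the
# date cannot be misread across time zones.
published = datetime(2019, 2, 22, 9, 30, tzinfo=timezone(timedelta(hours=1)))

article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",          # hypothetical value
    "datePublished": published.isoformat(),  # "2019-02-22T09:30:00+01:00"
}
jsonld = json.dumps(article, indent=2)
```

The same unambiguous timestamp should then also appear in the visible byline, so the markup and the page never disagree.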

Question 42:10 – When it comes to categorizing content, AdSense is classifying some of our pages as adult, then lifting this restriction after a manual review. Does this AdSense classification feed into, or is it linked to, search categorization? Do the two platforms use the same algorithms or talk to each other?

Answer 42:30 – As far as I know, AdSense does a lot of these classifications completely differently from search; they use their own systems for this. Part of that makes sense because they have different policies. When it comes to search, we might choose to show things one way, but AdSense, because they’re focused more on the advertising part of the setup, might have more restrictive policies, where they say this type of content is not something we’d like to place ads on, or they just might have different policies than we would have in search overall. So just because you’re seeing something happening on the ads side doesn’t necessarily mean the same thing would apply on the search side.

Question 43:22 – When will structured data testing tools start showing schema injected through tag manager in JSON-LD?

Answer 43:30 – I don’t know. I have seen that question pop up again on Twitter, so I’ll definitely bring it up with the team to see what we can do to make that a little bit easier. In general, using Tag Manager to inject things like structured data, or other search-related functionality, into the page is something that you can do. It’s something that we often pick up, but it’s a lot harder to diagnose and it’s a little bit fragile. So I would recommend, if at all possible, making sure that you include the structured data directly on the page. That way you can use all of the testing tools out there to confirm that it’s working correctly, and you can be sure that Google search is always taking it into account. Whereas if you use Tag Manager for some of these things, we can pick that up when we render the page, but it takes a little bit longer to get there, and if anything subtly breaks along the way towards rendering that page, it might happen that we don’t make it, and it probably will happen that other search engines won’t be able to pick that structured data up either. So I’m okay with using Tag Manager as a stopgap solution until you can actually change those pages, but in the long run I would really recommend putting the structured data directly on the page so that you don’t have this unclear situation.
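Putting the markup "directly on the page", as John recommends, usually means serializing it into a script tag in the HTML template at build or render time rather than injecting it client-side. A minimal sketch; the helper name and the Product values are made up for illustration:

```python
import json

def jsonld_script_tag(data):
    """Serialize structured data into a <script type="application/ld+json">
    tag that can be embedded in the page's HTML server-side, instead of
    being injected in the browser via Tag Manager."""
    return ('<script type="application/ld+json">'
            + json.dumps(data)
            + '</script>')

tag = jsonld_script_tag({
    "@context": "https://schema.org",
    "@type": "Product",          # hypothetical example markup
    "name": "Example chair",
})
```

Because the tag is part of the initial HTML, it shows up in any testing tool that fetches the raw page, with no dependency on rendering.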

Question 45:15 – When a website rolls out a new tech stack progressively, Googlebot sees some sections of the site on the old stack and some on the new stack. Is this a scenario that crawlers and users can be routed through safely? What should be monitored to ensure it all goes smoothly and organic search isn’t impacted?

Answer 45:40 – I think for that last part you need to be aware that it’s essentially impossible to guarantee that, if you make bigger changes across your website, organic search won’t be affected. There are positive and negative aspects there. On the one hand, you might be making bigger changes on your website precisely because you want organic search to be impacted, because you want to rank better, and that’s something we should be able to pick up. On the other hand, it’s very possible that you roll out a new tech stack that breaks a lot of things that used to work well for search. So I would say that any time you’re doing a bigger revamp of a website, you have to assume that it can affect search, and it’s worth getting help from SEOs, developers, and other people to double-check things ahead of time rather than trying to go in afterwards and say, well, this didn’t work, our website disappeared from search, what do we do now? If you’re trying to fix it afterwards, it’s always going to be a much bigger struggle. It’s going to take a lot longer, and you might really break things in a way that has a longer-term impact and takes a lot longer to fix than if you had the right help and advice from the beginning, before the rollout. Obviously, sometimes schedules work out in bad ways and sometimes the right people are not involved in the right steps; things happen, so you can’t always prepare for everything. With regards to gradual rollouts, I think that’s a good approach to take, and it’s something where you can see how search engines start crawling the new stack that you provide. If you don’t change the URLs, which is usually the ideal situation, then you can see how those URLs change their performance in search.
You can see that in Search Console by looking at clicks and impressions for those URLs, and if you give it a little bit of time to settle down during your rollout, then usually you can confirm, okay, this is working, I can take the same kind of setup, apply it to a larger part of the website, and really progressively roll it out.

Question 48:26 – Can a really bad blog cause an entire website to drop in rankings?

Answer 48:50 – In general, we do try to look at a website overall, and if there are significant parts of the website that are really bad, then that can have an effect on the rest of the website’s rankings as well. Usually, when it comes to a situation where you have a blog and an e-commerce site, the e-commerce site is what everyone is focusing on and the blog just provides a little bit of extra information, and if the blog is kind of bad, that doesn’t really affect the bigger chunk of the e-commerce site. The one situation where it can play a role, and I think this question goes in that direction, is if the blog is set up in a technically bad way and Googlebot has a lot of trouble crawling it. Maybe accessing the URLs is really, really slow, or it returns a lot of server errors, which is mentioned here. What will happen then is that we will reduce the crawling of that website in general. So it’s not necessarily the case that we would drop it in rankings. Bad pages on a website, from a technical point of view, if they just don’t work, we try to drop those pages; but if, from a crawling point of view, we have trouble crawling a significant part of the website, then we will slow down crawling, because we want to make sure that our crawling is not the reason why this website is performing badly.

So, for example, if we crawl the blog and we see a lot of server errors, then we might say, well, maybe we’re crawling too hard; we don’t want to cause any problems, so we will reduce our crawling speed, and if the e-commerce site is on the same setup as the blog, then we would also reduce our crawling of the e-commerce site. We track the amount of crawling that we do at a host level, so if these two parts are on the same host, then we would probably try to crawl them at the same rate, and if one of them is really bad in that we can’t crawl it without a lot of server errors, then we’ll reduce the crawling overall.

And for a lot of websites that doesn’t really matter; we can still keep up with most changes on a website even if we don’t crawl as frequently. For a large e-commerce site, though, it can play a role. In particular, if you have products that come and go and we can’t keep up with crawling those products as they come and go, then the search results for the e-commerce site will end up getting a bit stale, and that’s something that users may see in the search results.

So if they’re searching for a new, I don’t know, a new phone, and your e-commerce site has that new phone but we haven’t been able to crawl those pages yet, then we wouldn’t be able to show your site in the search results for that new phone. That’s something where I would take a look and see what you can do to improve it. From a quality point of view, again, it’s less of an issue if a part of the website is not really that great, though obviously it can have an effect as well. But especially from a technical point of view, if we have significant issues with crawling a part of the website, then we will crawl less, and it’s not a penalty. It’s not something where we’re saying, oh, this website is bad, we will not spend so much time there; it’s more a matter of our algorithms trying to be good citizens of the web and saying, well, we want to be sure that we’re not the reason why this website is having so much trouble.

Question 52:36 – Does Googlebot follow the sitemap command in robots.txt file?

Answer 52:41 – We do pick up the sitemap file there, but Googlebot doesn’t crawl it as a link. We process sitemap files individually; they’re XML files, not pages with links in them. So Googlebot wouldn’t fetch it as an HTML page; it would request it as a sitemap file and process it with normal XML processing instead.
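The Sitemap directive John refers to is a plain line in robots.txt (e.g. `Sitemap: https://example.com/sitemap.xml`). A minimal sketch of how a crawler might pick those lines out, ignoring comments; the function name and example URLs are hypothetical:

```python
def sitemap_urls(robots_txt):
    """Extract Sitemap: directives from a robots.txt body.
    The directive is case-insensitive and can appear anywhere in the
    file, independent of any User-agent group."""
    urls = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()       # drop trailing comments
        if line.lower().startswith("sitemap:"):
            urls.append(line.split(":", 1)[1].strip())
    return urls
```

A crawler would then fetch each extracted URL and parse it as sitemap XML, not as an HTML page, which matches John’s description.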

Question 53:08 – On a media website, I have a responsive version of the home page and its sections, and the news articles also have AMP versions. Do you recommend having an AMP version of the home page and the sections as well, so that Google finds the AMP version of the news first? Does it have any effect on speed of indexing?

Answer 53:30 – From an indexing point of view, I don’t see any reason why you would need to do this. We pick up AMP pages when they’re connected to the normal pages: we crawl the normal page first, see the link to the AMP version, follow it, and pick up the AMP page as well. But primarily we crawl through the normal version of the page, the legacy or responsive version, however you would call it; we crawl with the non-AMP version. The exception, of course, is if your whole website is AMP; then obviously we will crawl that. With regards to having AMP versions of the home page and the sections, ultimately that’s up to you. If you’re seeing users go to individual news articles, click the home button there, and get sent to your responsive home version, maybe that’s okay, maybe that’s confusing for users; ultimately, that’s up to you. Usually those pages would not be shown in the news section where we do pick up the AMP pages, so it’s probably less of a direct issue and more something indirect and long-term.

Question 54:52 – Do you recommend having a sitemap file for URLs in the AMP version?

Answer 54:56 – You don’t need to do that. Again the exception is if your whole website is AMP only then obviously those are your pages.

Question 55:53 – I have some customers who have text at the bottom of their pages, and today it ranks well; we have traffic and sales. But this text is not good, because it was written for robots, not for people, and today we don’t know what to do. If we delete this text, we could lose our rankings; and second, we don’t know how to integrate this text. What do you propose?

Answer 56:26 – Yeah, I would look at this as more of a mid-term or long-term project: think about ways that you can integrate part of that text better within the normal part of the page. It’s not the case, at least at the moment, that we would look at those pages and say, oh, this is terrible, we should demote the website for doing this, but rather we try to pick up that additional content, and to some extent I think the additional content is useful. If you can’t just squeeze it into the rest of the page, then try to think about ways that you can do that in the long run. Especially, what I would recommend is trying to find ways to reduce the amount of text that you’re providing like that. So instead of this big Wikipedia article at the bottom, reduce it to maybe a couple of sentences, so that it’s still useful for users when they look at it and still provides the right context for search engines when they look at the page overall.

So it’s not something I would call a critical problem that you need to solve immediately, but rather something where maybe people have just been using this as a way to, I don’t know, work around other deficiencies on the page in the beginning, and maybe it’s better to just fix those deficiencies in the long run than to keep building out these long texts that nobody reads.

Question 58:03 – The problem we have today is that we have some keywords in this text, and it’s connected to 50 keywords that we rank for on Google. We think that if we write some interesting text for clients, for customers, we can’t use the same keywords. Would it be better to rewrite it?

Answer 58:32 – I don’t know. I would try to look at ways that you can make the text work for both, but again, I would look at this as something more long-term and not say, oh, I need to rewrite everything today, otherwise Google will penalize me. I think a lot of times people put a lot of text on these pages specifically for search engines and don’t realize that search engines already ignore a lot of that text. So if you can reduce it so it’s less keyword stuffing, then maybe it even works better. It sometimes takes a little bit of practice to find the right approach to making sure that you cover the content people are searching for and also provide the context people need when they look at that page individually.

Question 59:24 – I usually write the blog posts for my Russian blog, and I use a lot of alt text on the pictures. But when I write step-by-step guides and similar information, I use a lot of screenshots. Do I need to put alt text on all of them? I’m not sure that these screenshots need to rank in Google Images.

Answer 59:52 – That’s up to you, totally up to you. I would look at this more as: how would people searching visually come to my website, rather than: how can I get as many images as possible into Google Images. From that point of view, if you’re saying these screenshots help those pages but nobody is going to search for these screenshots, then why bother? No need to go into too much detail there. Obviously, for screen readers, having the alt text is useful, so some amount of alt text is worthwhile, but if you know that people are not coming to your website through Google Images for those images, then you can write your alt text in a slightly different way. You don’t have to worry about the image search targeting part of it.
