In this help hangout IRL, we are joined by Martin Splitt (far left) of the JavaScript team at Google, along with several attendees asking questions in person. There were some great questions about JavaScript, Manual Actions, and Indexing. Here are the most helpful questions and answers provided by the community, Martin, and John. The full video and transcript can be found below!

My question is about Image Sitemaps. Is Google already smart enough to find the image URLs in the respective source code of the pages?

3:03

We do find it with the source code. But the image sitemaps helps us a little bit to understand that it's really something that you want to have indexed. So I would see it more as a decision on your side. Do you want to have your images indexed or not? And for some websites the images are not that critical, because it's just decorative or used more for design. But for others maybe people are searching using image search. And they'd like to find something on your website. I think it's something that a lot of SEOs don't spend too much time on now, which is kind of I think a shame, because for some types of activities people search with images quite accurately. So it's really something to think about. How do you want people to come to your website? And how might they search with images for your website?


Summary: Google can find your images in the source code, but an image sitemap helps Google understand that you want those images indexed and easily discovered in image search. If you want to be found in the image results, we definitely recommend submitting an image sitemap.
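
Our note: if you decide your images should be indexed, a minimal image sitemap sketch looks something like this (the URLs are placeholders, not from the hangout):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/products/blue-widget</loc>
    <!-- List each image on this page that you want considered for Image Search -->
    <image:image>
      <image:loc>https://example.com/images/blue-widget-large.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```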


What about the size of images? Does Google recognize the different sizes?

4:27

Yeah, we try to recognize the different sizes. And we can understand that the same image is available in different sizes. Usually we'll try to pick a higher resolution image for image search just so that we have something nice to show to people. But that doesn't mean that you need, like, gigantic samples. More like a larger size that a user might see, not something like you have to deliver the original file-- anything crazy.


Summary: Google can recognize that an image is available in different sizes and will usually pick the higher resolution version for image search.


We have noticed that since we moved to React, our category pages show an error in the Google cache after two seconds. So first they display the page, and then after two seconds an error page.

7:42

I suspect that's something with JavaScript on your side, because if the page loads initially then, I mean, it's the cached version, it's OK. And we can run some kind of JavaScript on the cached page if it's within the cached page. So, probably, that JavaScript is running and, for whatever reason, you're picking that up and then turning it into an error page.


Summary: Google can run JavaScript on the cached page if that JavaScript is included within the cached page. Note that the cached version is the static HTML Google received, so a page built entirely with JavaScript may not display properly in the cache.


I noticed that if I do a minor change on a page, say, add a few things to a product, I notice that I have a drop in ranking and then it comes back up.

9:23

In general, what happens when you change the content of the page, we will try to reprocess the page and reindex the page. And sometimes that does mean that we have to re-evaluate how we rank the page. But it's not the case that we would move any existing pages just to page two or drop them in ranking just because they changed. We're just trying to understand what has changed on this page and index it in that way. One thing to maybe watch out for is, if you're also changing the URLs on these pages, so if you're moving things around if, instead of updating the existing page, you move it to a different location, then that does mean that we have to re-evaluate everything. And then you will definitely see some, at least temporary changes in ranking.  


Summary: When the content of a page changes, Google reprocesses and reindexes it, which can mean re-evaluating how it ranks. If you also change the URLs and move content to new locations, Google has to re-evaluate everything, and you will likely see at least temporary changes in ranking.


Can changing a theme make a difference in how a site ranks?

15:22

So changing the theme of the website. Sure. That can have an effect on the ranking as well. Most of the time, the change that you would see there if you change the theme is all of the internal links are suddenly different, as well. So maybe you have a lot of internal links in the menu on the side, and with the new theme you just have a smaller set of internal links or related topics that are on the bottom of the page. Maybe you have more or less of that. And so that's something where if you change the theme of the site, then all of those internal links can theoretically change. And that can definitely have an effect on how we crawl and index the site.


Summary: Changing a website's theme usually changes the internal links. For example, if one theme had many internal links in a side menu and the new theme has fewer links pointing to your content, that can affect how Google crawls and indexes your site.


If a web page mentions your brand but doesn't actually link to your website, would that count as a positive signal?

23:28

So I mean, we would pick that up as a mention on another web site, but it's not a link. It doesn't pass any page rank. So it's kind of not really the same thing as a link.


Summary: A mention is not the same thing as a link, but Google can still see it. Our note: This is probably something that is used in Google's assessment of E-A-T. For example, if the New York Times mentioned you but didn't link to you, you wouldn't get PageRank, but you may see an improvement in Google's perceived authority of your site for your topic.


Is it a problem if your staging site gets indexed?

23:52

 It does sound like we're maybe indexing your staging site, which probably you don't want to have happen. I've seen lots of weird problems come up when that happens. So I'd recommend blocking that in whatever way works best for you, which could even be blocking it by user agent if you can do that or blocking it by robots.txt in a worst case so that it doesn't get indexed there.


Summary: You do not want to have your staging site indexed in search! Our note: Not sure if it is indexed? You can do some site: searches to find out. For example, if your staging site is on dev.example.com, do a search for site:dev.example.com to see what is indexed.
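
Our note: one common way to keep a staging host out of the index is to serve a restrictive robots.txt only on the staging hostname; John's suggestion of blocking by user agent or adding authentication is even safer, since robots.txt only stops crawling, not indexing of a URL that is linked elsewhere. A sketch, assuming a hypothetical dev.example.com host:

```
# robots.txt served only on dev.example.com (never on the production host)
User-agent: *
Disallow: /
```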


We're currently using Google Tag Manager for structured data. Is that OK or not?

26:45

It's OK. But I think what you need to keep in mind is that the Tag Manager needs to be run by JavaScript. So it only works when we can render your pages, which also means that there is sometimes a delay between when we pick up your page, and when we process the JavaScript, and render the pages, and are able to pull out the structured data with the Tag Manager. So that's one thing to keep in mind in that there's a bit of a delay there. The other thing is that you're adding quite a bit of complexity for just a few snippets of JavaScript that you're injecting into your pages, which means things can go wrong along the way. And you might not necessarily notice it when you're looking at those web pages. So that's something where I would recommend, if at all possible, try to get the structured data directly into the pages so that it's a lot easier to diagnose when things go wrong, when things break, and to fix them early on.


Summary: Tag Manager runs via JavaScript, which means the structured data can only be picked up once Google renders the page, and that can add a delay. John recommends putting the structured data directly into the page so it's easier to diagnose when things go wrong.
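
Our note: putting the markup straight into the HTML can be as simple as a JSON-LD script block in the page source, for example (the values are placeholders):

```html
<!-- Structured data embedded directly in the page, so it is present in the
     initial HTML and does not depend on Tag Manager or JavaScript rendering -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": "https://example.com/images/example-product.jpg",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```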


Speakable structured data, is there a time-frame when it's coming outside of the US?

28:00

I don't know. A lot of times, we test it on the US market probably because we have a lot of people in the US, but also because maybe they like to try stuff out. And if it works out there, then usually we end up rolling that out to other countries and other languages. With speakable markup, maybe it's also a matter of being able to understand that and being sure that the quality of this content is OK. So what I would recommend there is to implement it on your pages and kind of be ready for when it does roll out for other countries.  


Summary: Currently it's only being tested in the US market, but John recommends implementing it on your pages so your site is ready when it does roll out to other countries.
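
Our note: a minimal speakable sketch for an article page might look like this (the CSS selectors and URL are placeholders you would adapt to your own templates):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "url": "https://example.com/example-article",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".article-headline", ".article-summary"]
  }
}
</script>
```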


If we ignore manual actions, what will happen?

28:46

That doesn't sound like a particularly good strategy, I think. So manual actions are done from the Web Spam team. When they see something significantly wrong that they feel they need to manually fix that, so if you don't fix those issues, then those issues will remain. And that manual action will remain. And whatever that manual action does will kind of persist. So these manual actions do expire, at some point. But we're talking more about order of years, rather than weeks or anything like that. So if you want to wait out a manual action for a couple of years, then that's your decision but probably not a good short-term one. So I personally recommend taking these into account, and taking them as feedback, and fixing them, and doing the reconsideration. I mean, if your website isn't meant to be shown in search, then maybe you don't care about manual actions. If you're just trying to do ad landing pages, then maybe that's fine.


Summary: It is not a good idea to ignore a manual action. Manual actions do expire, but over the course of years rather than weeks, so it's recommended that you take them as feedback, fix the issues, and submit a reconsideration request.


Does Googlebot use cookies?

31:28

Googlebot doesn't keep a cookie.


Summary: Googlebot doesn't keep a cookie, so cookie-based personalization isn't an issue for crawling as long as the content Googlebot sees is equivalent to what users see.


 

Our homepage can really be completely different for each user. If we see that the user is interested in sport, then we would probably show 80% of the stuff as sport, and, for another user, it will be economy news or something. Is that OK?

33:20

Yeah. That's totally up to you. That's something where you have to work with your users to find the right balance. But, from our point of view, people would be going to your Home page because they're kind of searching for your brand or for general Swiss news. And they go to the Home page. And then your home page is still relevant even if it has different content there. I think, especially for news sites, that kind of change happens naturally, anyway. We crawl it at one point, and we see this collection of articles. The user goes there. There's a different collection because things have changed. And that's kind of normal.


Summary: It is completely normal to have different content served for different users especially for news sites where content changes naturally anyway.


We have regional category pages for wine tasting for different cities. They have been dropped from the index because Google says they are duplicate. When in fact these pages are really unique and have individual content and individual links. And the events on them are in different cities, so there is actually no way of confusing them. So I don't really understand what's happening there.

34:16

So, usually, this happens when we think that a part of your site is duplicated, so kind of like how you saw. And it's not always the case that we look at the content and say, oh, the content is exactly the same. But sometimes the case that we look at the URL structure, and we see maybe the city name is-- I don't know-- a URL parameter, and you can enter any kind of word as this URL parameter and still get one of those cities, then we might decide that maybe this URL parameter is irrelevant-- that it's not necessary for determining the content on the page. So that's kind of one place where this could be coming from. It could also be that, maybe for some cities, you're showing the same content. So, for example, if there are cities that are right next to each other, maybe you show the same content because maybe the events are the same for these two locations. And that might be confusing us. So what I would do there is double-check to make sure that the difference that you have between the individual locations is really clear, that the URLs are such that, when you enter an invalid city name, that you show a 404 page, instead of picking another city, and kind of making sure that, for us, it's really clear that these parameters, or the path, or whatever you're using for that is really always leading to a unique, different page.


Summary: Double-check that the difference between the individual locations is really clear, and that the URLs return a 404 page when an invalid city is entered instead of picking another city. Make sure that the parameter or the path always leads to a unique, different page.
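
Our note: the key point is that an unknown city segment should return a real 404 rather than silently falling back to another city's page. A rough sketch in an Express/TypeScript route (the route, city list, and renderer are hypothetical):

```ts
import express from "express";

const app = express();

// Hypothetical list of cities that actually have event pages.
const knownCities = new Set(["muenchen", "hamburg", "berlin"]);

app.get("/wine-tasting/:city", (req, res) => {
  const city = req.params.city.toLowerCase();

  if (!knownCities.has(city)) {
    // Unknown city: return a real 404 instead of showing another city's
    // content, so the city segment isn't treated as an irrelevant parameter.
    res.status(404).send("Not found");
    return;
  }

  res.send(renderCityPage(city)); // hypothetical page renderer
});

// Placeholder renderer for illustration only.
function renderCityPage(city: string): string {
  return `<h1>Wine tasting in ${city}</h1>`;
}

app.listen(3000);
```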


A question for Martin regarding single-page applications. Just approximately, when do you think Google will be able to render a client-side website completely with the first crawl?

38:21

MARTIN -  Right. So we are working right now, as we speak, on getting rendering and crawling better integrated. Everything moves a little closer together. I can't give you a timeline on that. It's a work in progress. We also try to improve rendering, to begin with, because, if you ask, can you index single-page applications, yes. We can do that today. And the rendering gap is not always as big as people think. So I wouldn't worry too much about it. The problem is that the client side rendering also influences how quickly we can queue things for crawling. So the crawl budget plus the rendering is something that is really hard to spot from the outside. So you only see, like, oh, my page takes a day, or two, or three to be indexed. That doesn't necessarily mean that it's a rendering issue. It might be a crawl budget issue. It's a tricky one to figure out. I've seen that a few times now. And, even though that's an issue that we understand, it might take a while for us to bring things closer together. So, for the time being, if you really care about frequently changing content and have lots of frequently changing content, I would still recommend dynamic rendering. That's not something that's going to just disappear in a month or so. We are working on improving it. It's going to take a while. I'm not going to give you a timeline.

JOHN - Yeah. I think, especially if you have a website that has a lot of content, you should really be using dynamic rendering or server side rendering. I also just today saw that there was a blog post from Bing a few months back that they were also saying use dynamic rendering for things like that. So I'd really kind of invest in that direction. And, like Martin mentioned, it's not going to go away. So social media sites-- also, if you want to share things, or if you want people to share a URL on WhatsApp, then it also needs to be pre-rendered for that.


Summary: Google is currently working on integrating rendering and crawling more closely, but there is no timeline for when that will ship. If you have a website with a lot of content, or with frequently changing content, we recommend using dynamic rendering or server-side rendering.
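
Our note: dynamic rendering boils down to serving pre-rendered HTML to crawlers and link-preview bots while normal users get the client-side app. A very rough Express/TypeScript sketch, assuming some pre-rendering step already exists (the bot list, paths, and the pre-render lookup are all hypothetical):

```ts
import express from "express";
import path from "path";

const app = express();

// Very rough bot detection; real setups usually rely on a maintained
// list or a prerendering service rather than a hand-written regex.
const BOT_UA = /googlebot|bingbot|whatsapp|twitterbot|facebookexternalhit/i;

app.get("*", async (req, res) => {
  const ua = req.headers["user-agent"] ?? "";

  if (BOT_UA.test(ua)) {
    // Crawlers and link-preview bots get pre-rendered static HTML.
    res.send(await getPrerenderedHtml(req.originalUrl));
  } else {
    // Regular users get the client-side rendered single-page app shell.
    res.sendFile(path.resolve("dist/index.html"));
  }
});

// Hypothetical: look up HTML produced earlier by a headless browser or
// a prerendering service, keyed by URL path.
async function getPrerenderedHtml(urlPath: string): Promise<string> {
  return `<html><body>Pre-rendered content for ${urlPath}</body></html>`;
}

app.listen(3000);
```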


How long will it take once we fix a manual action?

50:17

So, usually, that needs to be processed manually because it's a manual action. So someone needs to take a look at that and review the changes that you've done. And, usually, you get a reply back within about a week or two. So if you fix these issues, then I would go ahead and submit the reconsideration request. And, usually, within about a week or two, you should have an answer there, with regards to yes, this was OK. We will resolve this manual action. Or no, there are still issues here. Sometimes, depending on the manual action, it can also be that we see you fixed a lot of this, but you haven't fixed everything. So we will reduce the scope of the manual action. With structured data, that might be a bit trickier.


Summary: Usually you will hear back about a reconsideration request within about a week or two, depending on the manual action.

*Marie’s Note: We have had cases where it has taken as long as six weeks to receive a reply after filing a reconsideration request. Mind you, these were a few years ago.


How can we ask for reconsideration?

52:03

That's within Search Console. Where you see the manual actions, you can submit a reconsideration request. It's fine to do multiple of these if you're working on a problem, and you're trying to resolve it. On the other hand, if we see that you're just going back and forth, and fixing things, and getting manual action again, then kind of moving things, trying things back and forth, then it might be that the Web Spam team says, OK. They're just toying with us. We will ignore their reconsideration request for a while or kind of stop them from submitting more of them. So I would see this as something where you really try to fix the problem, do the reconsideration, and react based on the feedback there. I would also get help from the Webmaster forum if you're not sure about what exactly you should be fixing because, a lot of these issues, you're not the first one to run into this problem. And people in the forums, they have practice with this. So I'd definitely get help.


Summary: Within Search Console, where you see manual actions, you can submit a reconsideration request. Be sure that you are actively trying to resolve the issue when you do.


I've seen sites that go, within a month or two, from less than a thousand to a million in traffic on an auction domain. It's ridiculous.

57:28

Those are always interesting for the Web Spam team. So if you have some examples, they like looking at that. It's not so much a matter of us taking these examples and saying, oh, this one specific site is doing something wrong. But rather taking those and saying, well, this might be a general problem. And we might be able to recognize this better on a broad scope with help from some examples. So that's something where we try, from a Web Spam point of view, not to focus too much on individual URLs, but rather try to figure out what kind of is the system behind this so that we cover all of these cases, rather than just picking that one specific one out.  


Summary: The Web Spam team is always on the lookout for the systems behind spammy websites, so examples are welcome.


 

If you like stuff like this, you'll love my newsletter!

My team and I report every week on the latest Google algorithm updates, news, and SEO tips.

Full Transcript

 

Question 3:03 - My question is about Image Sitemaps. Where could the added value be for webmasters? And do we still need those kinds of sitemaps today? Or is Google already smart enough to find the image URLs in the respective source code of the pages?

Answer 3:32 - We do find it with the source code. But the image sitemaps helps us a little bit to understand that it's really something that you want to have indexed. So I would see it more as a decision on your side. Do you want to have your images indexed or not? And for some websites the images are not that critical, because it's just decorative or used more for design. But for others maybe people are searching using image search. And they'd like to find something on your website. I think it's something that a lot of SEOs don't spend too much time on now, which is kind of I think a shame, because for some types of activities people search with images quite accurately. So it's really something to think about. How do you want people to come to your website? And how might they search with images for your website?

Question 4:27- I also have a image question. What about the size of images, because we have discussions with designers that have really high resolution pictures on big screens. And then we do it automatically smaller for the mobile. And does Google realize the different sizes or yeah?

Answer 4:43 - Yeah, we try to recognize the different sizes. And we can understand that the same image is available in different sizes. Usually we'll try to pick a higher resolution image for image search just so that we have something nice to show to people. But that doesn't mean that you need, like, gigantic samples. More like a larger size that a user might see, not something like you have to deliver the original file-- anything crazy.


Question 5:23 - What about using robots.txt to noindex images? Because we have the case that we also have multiple versions of the image. And we want to make sure that we don't get in trouble in legal cases if we must delete something. So we want to make sure that just the version we allow is in Google Search.

Answer 5:52 - So usually what I'd recommend is using robots.txt to just block the other versions, especially if you have, like, different sizes in different folders or different URL patterns. Maybe that's easily recognizable. But I don't know if we would use X-Robots-Tag with that.

Question 6:57 - But you would recommend the robots.txt for that?

Answer 7:00 - Yeah.

Question 7:01 - OK, but robots.txt is just like saying don't crawl it. Does it also then say don't index it?

Answer 7:09 -  Well, for images, we really can't show anything if we can't crawl it. For web pages, we can guess and say, this is like a title that we found some links, but for images, we can't guess based on the text. You have an alt text that says, like, a picture of a beach with a boat, and it's like, OK. Cool. Can't really guess what it might look like. Now, cool.
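
Our note: a hedged sketch of what that robots.txt approach could look like if the different image versions live in separate folders (the folder names here are placeholders):

```
# Keep only the approved image folder eligible for Image Search
User-agent: Googlebot-Image
Disallow: /images/thumbnails/
Disallow: /images/previews/
Allow: /images/approved/
```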

Question 7:42 - I have a question regarding Google Cache. We have noticed that since we moved to React, our category pages show an error in the Google cache after two seconds. So first they display the page, and then after two seconds an error page. Is this a problem for Google? Our dev thinks this is due to our phishing protection.

Answer 8:07 - I suspect that's something with JavaScript on your side, because if the page loads initially then, I mean, it's the cached version, it's OK. And we can run some kind of JavaScript on the cached page if it's within the cached page. So, probably, that JavaScript is running and, for whatever reason, you're picking that up and then turning it into an error page.

MARTIN-  Sounds like the phishing protection kicks in.

JOHN - Yeah. Maybe something like that. I don't think for a search it would be a problem because if we can render the pages and get the content, that's fine. But, I don't know, it might still be where it's looking at and trying to fix. But in general, also with JavaScript sites, the cached version is the static HTML version that you gave us. So if the whole page is built on JavaScript, then we might not be able to show that in the cache.

Question 9:23 -  John, yes. So my question is, basically, changing a page. So I have an affiliate page. I noticed that if I change-- if I do a minor change on a page, say, add a few things to a product, I notice that I have a drop on a ranking, it comes back, comes back up. However, I do want to change my entire product, as it's no longer best. I'm concerned if my page will go to the second page because of that. That Google reevaluates the links that are coming to that page after the page is changed slightly or significantly.

Answer 10:20 - In general, what happens when you change the content of the page, we will try to reprocess the page and reindex the page. And sometimes that does mean that we have to re-evaluate how we rank the page. But it's not the case that we would move any existing pages just to page two or drop them in ranking just because they changed. We're just trying to understand what has changed on this page and index it in that way. One thing to maybe watch out for is, if you're also changing the URLs on these pages, so if you're moving things around if, instead of updating the existing page, you move it to a different location, then that does mean that we have to re-evaluate everything. And then you will definitely see some, at least temporary changes in ranking.  

Question 11:17 -  I have the exact same question, more or less. So let's say you have an established page that has been ranking for several years for second page, and then you decide to overhaul and improve the whole content of the page. Like, it goes from 1,000 to 2,000 or 3,000 words. So it's my understanding that the re-evaluation will take place and I heard from friends that they lost a lot of rankings when they did exactly that. So would just adjusting be to better add the new content step by step, or what's your take on this?

Answer 12:00 - I mean, usually, the idea is not that we drop a page in ranking just because it gets updated. Usually, it's more a matter of the new page is different and the new content is-- we try to take that into account. So that's something where, if you improve things for search, if you work with clearer structure, add headings and add text and images, all of those things, then you should see it go up. So it's not the case that every change that we see on this site means that we will kind of drop it in ranking first until we have it understood. So it should be something where, if you're making changes, you can see positive effects, you can see negative effects, depending on what you change.

Question 12:43  - But it's not like you start from zero again.

Answer 12:46 - No. If it's the same URL, if it's the same page, then we start with the same basis and we say, well, there's new content here. We have to see how we can work with this new content. That's kind of normal, right? A lot of pages change over time.


Question 13:02 - And what's the difference if we try to do the change step by step, adding new content bit by bit to avoid the re-evaluation? Are there differences in the outcome at the end?

Answer 13:16 -  No. There shouldn't be. So it's not something that you can really avoid because every time we recrawl those pages, there are small, subtle changes anyway, so we have to reprocess it every time we recrawl it. Even if it's changes, even if it's just the date on the page that was changed, it's a new page. We have to figure out how to rank it now. But it's not that we throw away all of your old signals that start over, it's more that the content has changed, we can take some of the signals and keep that and some of them are based on the content and we have to work with those again.


Question 13:51 - But why does it drop in the rankings sometime or often times?

Answer 13:56 - I think that's really just a matter of us understanding the page. That's like-- sometimes we think a page isn't as relevant anymore as it once was, and sometimes we think a page is better than it was before. So that can happen, too. So, usually, my recommendation there is to say that to not artificially slow things down, because if you're improving the content, if you're making the page better, then why would you kind of hold that back and not show it to search engines, and not show it to users, but rather make those changes. And if there is a temporary drop, then that's something-- maybe you have to wait it out and see what happens. And maybe it's also a sign that the changes you made maybe weren't that good for SEO things. So that's something where sometimes it makes sense to do it step by step, so that you can recognize this fairly quickly, especially if you're changing a big website. If you change the whole website at once and it goes down in search, then it's a lot of work to figure out what happened, what went wrong. But if you take maybe 10 or 20 pages and you change those and you see, oh, this worked and this one didn't work, then you can kind of improve things until you're finished.

Question 15:22 - Continuing on that question, if I do-- so my problem is I used a theme on this blog, basically, the theme is open content on the home page, and then one, two, three pages. Do you understand what I'm saying? I got a blog and that basically shows the whole thing on the home page. And then it trickles down to other pages. Now, the problem is, if I want to change the entire theme of the website, would that have an effect to my entire ranking?

Answer  15:58- OK. So changing the theme of the website. Sure. That can have an effect on the ranking as well. Most of the time, the change that you would see there if you change the theme is all of the internal links are suddenly different, as well. So maybe you have a lot of internal links in the menu on the side, and with the new theme you just have a smaller set of internal links or related topics that are on the bottom of the page. Maybe you have more or less of that. And so that's something where if you change the theme of the site, then all of those internal links can theoretically change. And that can definitely have an effect on how we crawl and index the site


Question 16:43 - So, basically, the site that I defined open blocks scenario, is that OK with Google having that type of layout? Because right now it's ranking and I don't see a problem, but I think there is more potential for it to rank. I think for some reason Google is holding-- because I've tested many different type of websites and it's the same amount of link power, it's not getting the same amount of results.

Answer 17:16 - I don't know. I think that's always hard to say what kind of potential is of a website because that depends a lot also on the general niche that it's active in, if it's very competitive, then even with the same website that might work really well.  So that's really hard to say.

Question 17:42 - OK. Second question. It will be my last, hopefully. Topic variation on a field website. So if I'm talking about best of kitchen stuff and then I want to talk about best of living products, how much variation can I go? I mean, I remember seeing Matt Cutts' blog post about health and how important that is. And I greatly appreciate that.  Like radiology and those topics are-- they are health-related and they need to be provided to users with great integrity. I mean, if I'm putting up products about living room and they're to help you with anxiety, would that go into a health-- I mean, it is an affiliate product line, but would that become a problem? And would I have an issue with that?

Answer 18:51 - I-- it's hard to say. It sounds like you're headed in that direction if you're trying to position your website as-- I don't know, interior design specifically for certain kinds of health issues. And you're kind of headed in that direction, where if you want people to take you seriously about your input on health issues, then maybe you need to position yourself as actually being serious about these. Not just randomly providing other things in there, as well.

Question 19:30 - Let me-- I appreciate that, and that's exactly what we're doing. But I wanted to make sure before I jump in to that scenario I'm fully prepared on that. I'm not going to get penalized because I put out something that's against Google's policy.

Answer 19:47 - Yep. So I wouldn't see that as something where we would have a manual action on a website because of that. So kind of the traditional penalty situation, that doesn't sound like that. Penalties mostly apply specifically to the webmaster guidelines. And I think what you're headed into is more around kind of the general quality and how we think the website is relevant for these specific search results. And for that, there's a very broad spectrum available. So it's not that we would penalize a website for having a certain mix of topics.

Question 20:34  - Can I ask a follow-up question on that? Around where, as well as changing content, you're also changing the domain. So we've got a good client at the moment and a big retail company who have also got the services arm and they've got two separate websites for retail and services. They want to amalgamate them both under the retail domain effectively. The services site ranks really well and they want to sort of shift it as effectively as they're sort of going under a single brand in a single domain. It is-- what does Google see as sort of the most important? Is it the content? Is it going to be how they structure the incorporation of the services site within the retail site? Obviously, we're going to be doing 301 from the existing site to the new one, et cetera. But where should we be looking to concentrate the efforts to make sure that we don't suffer too much of a drop?

Answer 21:37 - So I guess what I would focus on there, primarily, is to make sure, from a technical point of view, that you're really kind of putting things together in the right way. In particular, when you're merging sites like this, you would expect to see some drop in search because we can't just take the two sites, and add them up, and use that as a signal. We, essentially, have to reprocess everything and look at the new site the way it is. So, to make sure that that works, optimally, I would really work hard to make sure that, from a technical point of view, you're doing everything you can to redirect pages on a one-by-one level to the new versions and that you're not blocking anything from being crawled or being indexed, in the sense, that if it was indexable in the old site, that it should be indexable in the new site. So I would assume, from the content point of view, you're probably covered. But, from a technical point of view, a lot of the small redirects sometimes get lost. And that makes it harder for us to really combine all of those signals.


Question 22:45 - Where you've got a site with-- so these sites have got a lot of pages. Can we adopt a sort of triaging approach to the redirect-- so going for the priority, high-level category pages first and then working our way down into the lower-level pages?

Answer 23:02 - I think that would make sense, but I would try to use tools as much as possible to really get the broad coverage. You're really sure that everything is properly moved over.

Question 23:28 - If a web page mentions your brand but doesn't actually link to your website, would that count as a positive signal?

Answer 23:37 -  So I mean, we would pick that up as a mention on another web site, but it's not a link. It doesn't pass any page rank. So it's kind of not really the same thing as a link.

Question 23:52 - I've noticed our app spot staging versions are showing up as backlinks to our site. Do you think this is an issue? Should we disavow appspot.com?

Answer 24:02 -  You don't need to disavow appspot.com. If that's your staging site, you don't need to disavow those links. But it does sound like we're maybe indexing your staging site, which probably you don't want to have happen. I've seen lots of weird problems come up when that happens. So I'd recommend blocking that in whatever way works best for you, which could even be blocking it by user agent if you can do that or blocking it by robots.txt in a worst case so that it doesn't get indexed there.  

Question 24:36 - What's the correct way to use Breadcrumbs from an SEO point of view? So Home, Category, Item or Home, Tag, Item?

Answer 24:44 -  Both of these could work. That's, essentially, up to you how you want to structure your website, how you want people to navigate through your website.  
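
Our note: whichever hierarchy you pick, the BreadcrumbList markup itself is simple; a sketch with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Category", "item": "https://example.com/category/" },
    { "@type": "ListItem", "position": 3, "name": "Item", "item": "https://example.com/category/item/" }
  ]
}
</script>
```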

Question 24:56 - Do search suggest queries include voice queries done via Voice Search, Google Home, and Google Assistant?

Answer 25:04 - I don't know. I don't think so, but I don't know. Any of you guys know? [LAUGHTER]  Yeah. I think a lot of the voice search questions and the queries that come through Assistant, it's probably still something where people are working on to see how it ends up working in the long run. It's also something from the Search Console side. We're looking into how we can show that in Search Console. But it's kind of tricky because we don't really have a search results page. When you ask something, and Assistant gives you an answer, like, based on this website, this is the answer, how do you best show that in Search Console so that it's still useful?

Question 25:56 -  How do I redirect pagination from an old website to a new one?

Answer 26:04 -  Kind of similar to before, you wouldn't really need to look into the type of pagination or what's happening there, but rather try to redirect as much as possible on a one-by-one basis.

Question 26:26 - Would it be best to noindex PDFs that have the same content as their website?

Answer 26:22 - Maybe. I think that's kind of up to you if you want to have these PDFs indexed or not. For some websites, it makes sense to have PDFs indexed. For others, maybe less so. I would look into how users are using those PDFs.  

Question 26:45 - We're currently using Google Tag Manager for structured data. Is that OK or not--  kind of simplifying the question. [LAUGHING]

Answer 26:58 - It's OK. But I think what you need to keep in mind is that the Tag Manager needs to be run by JavaScript. So it only works when we can render your pages, which also means that there is sometimes a delay between when we pick up your page, and when we process the JavaScript, and render the pages, and are able to pull out the structured data with the Tag Manager. So that's one thing to keep in mind in that there's a bit of a delay there. The other thing is that you're adding quite a bit of complexity for just a few snippets of JavaScript that you're injecting into your pages, which means things can go wrong along the way. And you might not necessarily notice it when you're looking at those web pages. So that's something where I would recommend, if at all possible, try to get the structured data directly into the pages so that it's a lot easier to diagnose when things go wrong, when things break, and to fix them early on.

Question 28:00 - Speakable structured data, is there a time-frame when it's coming outside of the US?

Answer 28:06 - I don't know. A lot of times, we test it on the US market probably because we have a lot of people in the US, but also because maybe they like to try stuff out. And if it works out there, then usually we end up rolling that out to other countries and other languages. With speakable markup, maybe it's also a matter of being able to understand that and being sure that the quality of this content is OK. So what I would recommend there is to implement it on your pages and kind of be ready for when it does roll out for other countries.  

Question 28:46 - If we ignore manual actions, what will happen? [LAUGHTER]

Answer 28:50 - That doesn't sound like a particularly good strategy, I think. So manual actions are done from the Web Spam team. When they see something significantly wrong that they feel they need to manually fix that, so if you don't fix those issues, then those issues will remain. And that manual action will remain. And whatever that manual action does will kind of persist. So these manual actions do expire, at some point. But we're talking more about order of years, rather than weeks or anything like that. So if you want to wait out a manual action for a couple of years, then that's your decision but probably not a good short-term one. So I personally recommend taking these into account, and taking them as feedback, and fixing them, and doing the reconsideration. I mean, if your website isn't meant to be shown in search, then maybe you don't care about manual actions. If you're just trying to do ad landing pages, then maybe that's fine.

Question 30:09 - International SEO - At the moment, some of our page titles in the US, Germany, and other countries have southern Africa added to the end of the title tag. Southern Africa is shown in the search results, and it looks like people aren't clicking. We have the page title, hreflang, and international targeting set. What could be happening here?

Answer 30:35 - So I'd recommend maybe posting this in the Help forum and, ideally, sending me a link. You can drop it in the Google+ post for this event. And I can take a look there. What might be happening here is that, for some reason, we think, at least for a part of your website, southern Africa is kind of a site title type thing. And that's why we're adding it to these pages. Usually, that's more a sign that something with your site structure isn't right, where it looks to us that southern Africa is really a site title. And that might be with the URL structure, with internal linking, those kind of things. Might be something weird.

Question 31:28 - I have a question about personalization because our plan as a publisher is to personalize the whole homepage for every cookie. So every user will be targeted by interest profiles and stuff we have. And then we will show something. And, in one scenario, we will hide all the articles a user already read. So we have to do cloaking, I guess. So what's your idea how to handle Googlebot in this scenario?


Answer 32:03  - So the easy part is Googlebot doesn't keep a cookie. So that might make it a little bit easier. In general, personalization is something you can do if you want to do that. It's not something we would see as cloaking, especially if you've recognized that it's a repeat visitor, if they're logged in or not. I think that's totally up to you. The important part for us is that, from a general point of view, the content is equivalent. So I think this would primarily affect things like the Home pages, Category pages, maybe. If we still see those as Category pages, as Home pages for your website that lead to the rest of your website, then that's not something we would worry about. If the individual article pages also changed significantly, like someone clicks on an article-- I don't know-- about Christmas, and they end up on an article about Easter, then that would be the kind of thing where we'd say, OK. The user is being misled. But if they're searching for sporting events, and they land on your sporting event site, and that page is personalized for them, that's perfectly fine.

Question 33:20 - Yeah. The biggest problem is probably the Home page because this can really be completely different. If we see that the user is interested in sport, then we would probably show 80% of the stuff as sport. And, for the other one, it will be economy news or something.

Answer 33:32 - Yeah. That's totally up to you. That's something where you have to work with your users to find the right balance. But, from our point of view, people would be going to your Home page because they're kind of searching for your brand or for general Swiss news. And they go to the Home page. And then your home page is still relevant even if it has different content there. I think, especially for news sites, that kind of change happens naturally, anyway. We crawl it at one point, and we see this collection of articles. The user goes there. There's a different collection because things have changed. And that's kind of normal.

Question 34:16 - We had a massive drop in really important index sites. And these are highly individual and unique pages. They are our regional category pages, like Wine Tasting in Munich and Wine Tasting in Hamburg. And then I looked at Search Console, and I saw that these pages were excluded and dropped from the index because Google says they are duplicate, and the submitted URL is not selected as canonical and, instead, another one. But the problem is that the Munich page suddenly seems to be identified as a duplicate of another city page. And these pages are really unique and have individual content and individual links. And the events on them are in these cities, so there is actually no way of confusing them. So I don't really understand what's happening there. And we do have individual site maps for each store. And we use rel=canonical and do put self-referencing canonicals on each page. And the content is highly individual, so I don't understand what's happening here.

Answer 35:56 - So, usually, this happens when we think that a part of your site is duplicated, so kind of like how you saw. And it's not always the case that we look at the content and say, oh, the content is exactly the same. But sometimes the case that we look at the URL structure, and we see maybe the city name is-- I don't know-- a URL parameter, and you can enter any kind of word as this URL parameter and still get one of those cities, then we might decide that maybe this URL parameter is irrelevant-- that it's not necessary for determining the content on the page. So that's kind of one place where this could be coming from. It could also be that, maybe for some cities, you're showing the same content. So, for example, if there are cities that are right next to each other, maybe you show the same content because maybe the events are the same for these two locations. And that might be confusing us. So what I would do there is double-check to make sure that the difference that you have between the individual locations is really clear, that the URLs are such that, when you enter an invalid city name, that you show a 404 page, instead of picking another city, and kind of making sure that, for us, it's really clear that these parameters, or the path, or whatever you're using for that is really always leading to a unique, different page.

Question 37:12 - This is what we do, in fact. So we use speaking URLs, and these pages were identified. This only happened some weeks ago. And before that, Google didn't seem to have a problem with these pages. And we have speaking URLs, and the cities are not closely connected, like Munich is not really close to Hamburg. And the content on these pages is really, highly different. So I don't understand how this can be confused.

Answer 37:45 - OK. Maybe you can send me some example URLs. Or if you have a thread in the Help forum with the example URLs, maybe drop that into the Google+ comments, and I can take a look at that with the team because that is something that the team does care about to figure out where we fold things together properly and where we shouldn't. Sometimes we get that wrong.

Question 38:21 - A question for Martin regarding single-page applications. Yeah. Just approximately, when do you think Google will be able to render a client-side website completely with the first crawl? For example, there is no href tag anymore in the source code, so that SEOs don't have to bother with it anymore.

Answer 38:43 MARTIN - Right. So we are working right now, as we speak, on getting rendering and crawling better integrated. Everything moves a little closer together. I can't give you a timeline on that. It's a work in progress. We also try to improve rendering, to begin with, because, if you ask, can you index single-page applications, yes. We can do that today. And the rendering gap is not always as big as people think. So I wouldn't worry too much about it. The problem is that the client side rendering also influences how quickly we can queue things for crawling. So the crawl budget plus the rendering is something that is really hard to spot from the outside. So you only see, like, oh, my page takes a day, or two, or three to be indexed. That doesn't necessarily mean that it's a rendering issue. It might be a crawl budget issue. It's a tricky one to figure out. I've seen that a few times now. And, even though that's an issue that we understand, it might take a while for us to bring things closer together. So, for the time being, if you really care about frequently changing content and have lots of frequently changing content, I would still recommend dynamic rendering. That's not something that's going to just disappear in a month or so. We are working on improving it. It's going to take a while. I'm not going to give you a timeline.

JOHN - Yeah. I think, especially if you have a website that has a lot of content, you should really be using dynamic rendering or server side rendering. I also just today saw that there was a blog post from Bing a few months back that they were also saying use dynamic rendering for things like that. So I'd really kind of invest in that direction. And, like Martin mentioned, it's not going to go away. So social media sites-- also, if you want to share things, or if you want people to share a URL on WhatsApp, then it also needs to be pre-rendered for that.

MARTIN - If you have the engineering resources, though, I would highly recommend looking into server-side rendering or, even better, hybrid rendering because that yields user experience benefits, as well. Because you, basically, ship the critical content in the first go to your user, and then you enhance it with JavaScript to bring the single-page application experience on top of it. So you get the benefits of dynamic rendering, but you also get a user experience benefit. And things are appearing faster, especially on mobile. But if you can't, or if you don't want to gamble the engineering resources, then dynamic rendering is a really good idea.

Question 41:44 - Our company recently moved, and the road our building is located on is not properly mapped on Google Maps. When I try to upload the listing, it changes the street from Way to Road. We submitted feedback through Google Maps. Is there any other way to get that fixed?

Answer 42:00 - Probably, you would need to go through Google Maps. But I don't really know everything around Google Maps. So I would, perhaps, post in the Help forum about this. The times where I've had to get something fixed in Google Maps have usually been really quick. And they give a lot of feedback. So maybe that's fixed in the meantime.

Question 42:28 - I submitted some DMCA take-downs and was asked to do it on a per-page basis, but there's so many pages. What can we do then?

Answer 42:35 -  So DMCA is probably the best approach here. And it is on a per-page basis, so there's not really something I can do to take that away from you. That's kind of the legal process, in general, for situations like this.

Question 42:57 - What's the max size of an .htaccess file? We have a lot of redirects.

Answer 43:03 -  I think this depends on your server configuration, but you can also use things like regular expressions to simplify these files.
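
Our note: a single regular-expression rule can often replace long lists of one-off redirects. A hypothetical .htaccess sketch (the paths are placeholders):

```apache
RewriteEngine On

# One regex rule covering every /old-blog/<slug> URL instead of
# listing each redirect individually
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]

# A single exact redirect for a page that moved somewhere unrelated
Redirect 301 /old-contact.html /contact/
```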

Question 43:11- Would making topic categories within blog content that mimics the core web site's product category be a good idea?

Answer 43:21 - Totally up to you. You can do that if you want. If that works for you, then go for it. If it confuses users, maybe pick something different.  

Question 43:35 - Can you analyze the issue from my website? My website got a manual action for spammy structured data.

Answer 43:45 -  What I would do here is post in the Webmaster Help forum with that because sometimes the spammy structured data issues are a bit tricky to diagnose. And sometimes it takes some help from other people to figure out which direction you should be going.  

Question 44:02 -  I have a website that lists local businesses together with user-provided reviews. Sometimes we have-- what is it, really large JSON-LD snippets, over 100 kilobytes, because of all of the reviews. Is there a limit on the number of items? Is it OK to just include a few of the reviews?

Answer 44:25 - I don't know about a limit on the number of items. But 100 kilobytes for JSON-LD snippets seems excessive. So I would look into finding ways to just paginate these review pages somehow, rather than trying to include everything on one single page.
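
Our note: one common pattern is to mark up the aggregate rating plus only a handful of representative reviews, rather than every review on the page. A sketch with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "213"
  },
  "review": [
    {
      "@type": "Review",
      "author": { "@type": "Person", "name": "Example Reviewer" },
      "reviewRating": { "@type": "Rating", "ratingValue": "5" },
      "reviewBody": "Short excerpt of one review."
    }
  ]
}
</script>
```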

Question 44:36 - Private membership websites have a lot of unique content that's limited. But someone goes off, then copies it, and publishes it. What can we do?

Answer 44:56 - I think, first off, you probably need to look into whatever normal legal approaches you can take there, which might include the DMCA. But, from a web search point of view, we don't know where this content is coming from if we can't crawl it. So there's not really much that we can do on the web search side. It's more something that you'd need to resolve amongst each other directly.

Question 44:23- We're working with a health care group, multiple primary care locations. When creating content for each, we're attempting to concentrate on each of the locations' unique services.  How does Google view this in differentiating health care service locations for search?

Answer 45:45 - I think that's fine. If you have different locations, and if they're unique in their own ways, and you have content for that, that seems to be a good fit.  

Question 46:29 -  I don’t think you answered my question regarding changing themes. You said if I were to maintain all the existing internal link structure on the new theme but the HTML would change, would that have an effect? Or is the HTML having the effect on the overall ranking, or is the combined HTML, and the anchor text or the link structure?

Answer 46:58 - Yeah. That's a combination of both of them. It's something where the theme change-- it can just be a matter of changing the design. If just the CSS changes, then the HTML is the same, then probably that would be pretty non-issue. Probably not much would change there. But if the HTML changes, as well, then you could imagine things like the headings on the page being marked up with just bold versus marked up as a heading. And those kind of differences can play a role in how we evaluate a page. So if you're changing the HTML significantly, then that could affect how we rank those pages. I mean, it could also affect it in a positive way. So it's not always that things will go down. But if you improve things, then that's good for you, too.

Question 50:17 - How long will it take, I think, when we fix a manual action?

Answer 50:23 - So, usually, that needs to be processed manually because it's a manual action. So someone needs to take a look at that and review the changes that you've done. And, usually, you get a reply back within about a week or two. So if you fix these issues, then I would go ahead and submit the reconsideration request. And, usually, within about a week or two, you should have an answer there, with regards to yes, this was OK. We will resolve this manual action. Or no, there are still issues here. Sometimes, depending on the manual action, it can also be that we see you fixed a lot of this, but you haven't fixed everything. So we will reduce the scope of the manual action. With structured data, that might be a bit trickier.  

Question 51:18 - Let's see, another one about internal links and backlinks, whether Google has the same page rank recipe.

Answer 51:30 -  I don't know, like how we calculate the page rank. But, usually, it's not something that sites really need to worry about. So if you need to figure out the details of how page rank calculations work, then maybe you need to work at Google to get all of those details. Because, I mean, we don't show page rank externally anywhere. There's no toolbar that shows it. So it's kind of a theoretical question.  

Question 52:03 -  How can we ask for reconsideration?

Answer 52:06 - That's within Search Console. Where you see the manual actions, you can submit a reconsideration request. It's fine to do multiple of these if you're working on a problem, and you're trying to resolve it. On the other hand, if we see that you're just going back and forth, and fixing things, and getting manual action again, then kind of moving things, trying things back and forth, then it might be that the Web Spam team says, OK. They're just toying with us. We will ignore their reconsideration request for a while or kind of stop them from submitting more of them. So I would see this as something where you really try to fix the problem, do the reconsideration, and react based on the feedback there. I would also get help from the Webmaster forum if you're not sure about what exactly you should be fixing because, a lot of these issues, you're not the first one to run into this problem. And people in the forums, they have practice with this. So I'd definitely get help.


Question 53:12 - I have a question about the structured data, called Carousel. And I just see these for recipes as of now. Is it planned to have it for other types? So is it a good idea, for example, to go for, if we have a tag page for a celebrity, to add articles about a celebrity into this Carousel, and maybe we will get not just the general information about the celebrity, but we can also offer articles below as a Carousel?

Answer 53:45 - I don't know. I mean, it's kind of you're making a bet that this will happen. So I can't say.

Question 53:54 - I only see it for recipes. So, in the documentation, it was not mentioned as being exclusive to recipes, so--

Answer 54:00 - Yeah. I could see it being expanded over time. But I think, from a practical point of view, you, as someone who is working on the site, would have to make that decision and say, well, I believe, in the future, this will happen. And we will be prepared. But you have to know that maybe it doesn't happen. It's kind of a bet that you'd have to take on your site. And maybe it's just minimal amount of work to add that, so it might be worth just doing it if it's easy enough. But if it's a lot of work, and Google doesn't support it at the moment, then I don't know if that's the best use of the time.


Question 55:17 - The update that Google had in August, September, and trickled down in October, some sites had ghost effect of links. I'm still seeing sites that are using the same techniques and ranking. However, they've changed a little bit of a strategy because, I guess, I can tell how Google picked up those strategy. And the other sites are using that strategy with the bit of different technique. Is there more to expect from Google on this? Because it's not fair for sites that are ranking with the proper outreach and with white hat links, I would call them. Those are gray hat links. And they're excelling with-- they get one link, it could be equivalent to 100 links that we may have. So it's not really fair for working out 100 links or trying to get 100 links to a page, and somebody just puts one, and breaks it for you.

Answer 56:34  - I'm not sure what, specifically, you're referring to, so it's kind of hard to say. In general, if you see sites doing things that are against the Webmaster guidelines, feel free to send us spam reports. We really appreciate those. If you're seeing bigger scale things that are hard to report, you can also send those to me directly, and I can pass those on to the team. I would just post it in the Google+ comment, and then I can pick it up there.


Question 57:14 - I've seen sites that are, within a month or two, from less than a thousand to a million traffic on an auction domain. It's ridiculous.

Answer 57:28 - Yeah. OK. So like an expired domain type thing?  OK Yeah, those are always interesting for the Web Spam team. So if you have some examples, they like looking at that. It's not so much a matter of us taking these examples and saying, oh, this one specific site is doing something wrong. But rather taking those and saying, well, this might be a general problem. And we might be able to recognize this better on a broad scope with help from some examples. So that's something where we try, from a Web Spam point of view, not to focus too much on individual URLs, but rather try to figure out what kind of is the system behind this so that we cover all of these cases, rather than just picking that one specific one out.