I'm excited to announce that we will once again be reporting in detail on Google's Webmaster Help hangouts. I stopped doing this in 2014 as I got far too busy with consulting work. However, now that we have a team of fantastic SEOs here at Marie Haynes Consulting, we are going to start this up again.

In each post, you'll see a summary of the most important issues raised. At the bottom, you'll find a near-complete transcript of the hangout.

Without further ado, here is what we can learn from the latest help hangout with John Mueller. We have listed the topics in order of usefulness.


 

What's the deal with "discovered, not indexed" pages in GSC?

In this question, the site owner has 99% of their pages reported as "discovered, not indexed" in GSC. John said that this "sounds like [Google is] seeing a high number of pages in their system and they are not interested in indexing all of these." He said that sometimes this can be a result of poor internal linking. For example, a page may be discovered because it's in your sitemap, but if it is not linked to internally, Google may be less likely to want to index it.

He also said that if 99% of the pages are being excluded, be sure that you are not autogenerating URLs with parameters, as this could make Google think the site contains duplicate content.

He mentioned that there is no simple technical test you can do to determine whether a page is of high enough quality to index. He recommends looking at each of these pages individually.
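There is no scriptable test for quality, but the parameter issue John mentioned is easy to check for. As a rough illustration, the following Python sketch groups URLs that differ only by their query string; the example URLs are hypothetical, and in practice you would feed it your own crawl export or log data.

```python
from urllib.parse import urlsplit
from collections import defaultdict

def strip_params(url):
    """Return the URL without its query string or fragment."""
    parts = urlsplit(url)
    return f"{parts.scheme}://{parts.netloc}{parts.path}"

def find_parameter_duplicates(urls):
    """Group URLs that differ only by query parameters."""
    groups = defaultdict(list)
    for url in urls:
        groups[strip_params(url)].append(url)
    # Keep only the groups where several URLs collapse to one path.
    return {base: variants for base, variants in groups.items() if len(variants) > 1}

if __name__ == "__main__":
    # Hypothetical example URLs; in practice these would come from a crawl export.
    urls = [
        "https://example.com/widgets?sort=price",
        "https://example.com/widgets?sort=name",
        "https://example.com/widgets",
        "https://example.com/about",
    ]
    for base, variants in find_parameter_duplicates(urls).items():
        print(base, "->", len(variants), "parameter variants")
```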

 

Should we be paying attention to E-A-T?

This question was an interesting one as we have been paying a lot of attention to Expertise, Authoritativeness and Trust here at MHC.

John mentioned that E-A-T comes from Google's Quality Raters' Guidelines. He said, "It's very useful to look at these guidelines in terms of the direction that we want search to go." But he also pointed out that it's not as though they take the information in the QRG and turn it directly into code that matches it exactly. He said we should be looking at the bigger picture, saying, "For example, I know Google looks at expertise, authority, and trust so maybe I should work on these things because I am missing some things that signal that on my website."

We have had really good success in helping site owners see traffic improvements by improving how E-A-T is displayed on their website and also by improving offsite E-A-T factors (such as getting more authoritative mentions). If you are a paid member of my newsletter, you have access to the Quality Raters' Guidelines checklist that my team and I use to assess things like E-A-T.

 

Should you make expired product pages 404?

In this question, the site owner has product pages that serve a 404 error when the product is out of stock due to the season. Then, when they do have the product in stock again, they remove the 404. John said that this is a tricky situation. He recommends improving it by adding appropriate internal linking. Or, he said, if you have a lot of seasonal content, it might make sense to have one page for seasonal content and switch out the content depending on what season it is.
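To picture the "one seasonal page" idea, here is a minimal, hypothetical sketch using Flask: the URL stays live all year and returns 200, and only the content block changes with the season. The route name and content are made up; this is just one way such a setup could look.

```python
from datetime import date
from flask import Flask

app = Flask(__name__)

# Hypothetical content blocks keyed by season.
SEASONAL_CONTENT = {
    "winter": "<h1>Winter collection</h1><p>Sleds, skates, and snow gear.</p>",
    "summer": "<h1>Summer collection</h1><p>Kayaks, tents, and swimwear.</p>",
}

def current_season(today=None):
    """Very rough season check, used only for illustration."""
    month = (today or date.today()).month
    return "summer" if 4 <= month <= 9 else "winter"

@app.route("/seasonal-collection")
def seasonal_collection():
    # The URL stays live year round (HTTP 200); only the content changes,
    # so the page can keep accumulating links and signals over time.
    return SEASONAL_CONTENT[current_season()]

if __name__ == "__main__":
    app.run(debug=True)
```

The point of this design, as John describes it, is that the single URL keeps collecting links and signals year round instead of repeatedly disappearing behind a 404.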

 

Will a large number of 404 pages impact rankings?

John says no.

 

Seeing "temporarily unreachable" on fetch and render?

We get this quite often! John said that this is a common question in Google's help forums. He said that when Google fetches a page, they have a limited time to fetch all of the resources. He recommended looking up information on crawl budget, making your server faster, and reducing the number of embedded resources needed to render the page.
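One practical starting point is simply counting how many resources a page asks Googlebot to fetch. This standard-library Python sketch is a rough, hypothetical example of that kind of audit; the URL is a placeholder, and a dedicated crawler or Lighthouse will give you far more detail.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class ResourceCounter(HTMLParser):
    """Counts external resources referenced by a page (scripts, stylesheets, images)."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.resources.append(("script", attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(("stylesheet", attrs["href"]))
        elif tag == "img" and attrs.get("src"):
            self.resources.append(("image", attrs["src"]))

if __name__ == "__main__":
    # Placeholder URL; replace with a page you want to inspect.
    url = "https://example.com/"
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    counter = ResourceCounter()
    counter.feed(html)
    print(f"{len(counter.resources)} embedded resources found on {url}")
    for kind, src in counter.resources:
        print(f"  {kind}: {src}")
```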

 

Interesting question about a site that lost rankings after an HTTPS migration

It is not uncommon for a site to see a dip in rankings after switching to HTTPS. In this question, the site owner mentioned that all of these losses were regained (after two months) with the November 16, 2018 algorithm update. John mentioned that this could just be normal, or it could be that there were search quality changes that happened at the same time. He said, "We made big updates to the core algorithm in the last couple of months."

 

Are tag pages all low quality?

If you use tag pages in WordPress, they can often produce a lot of thin content in the search results. John said that sometimes tag pages can be good. He suggests manually reviewing these pages and determining whether it makes sense for them to be in the index.

 

If you're removing a lot of old thin pages, how long before Google recognizes this?

John says it can take a long time for Google to recognize this...even as long as three to six months. He did say that if people are clicking on these links from the search results, you could use the URL removal tool in GSC. However, there is a limit to how many URLs can be removed using this tool.

John also mentioned that while this content is in the index, it may be causing Google to overlook the high quality content on your site.

In answering another question in this hangout, John said that you can temporarily add these pages to your sitemap so that they potentially get recrawled by Google faster. But, be sure to remove them later.
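If you try the temporary sitemap approach, the file is just an ordinary XML sitemap listing the removed URLs, ideally with a recent lastmod date. Here is a minimal sketch that builds one; the input file name is an assumption, and remember to delete the sitemap once Google has recrawled the URLs.

```python
from datetime import date
from xml.sax.saxutils import escape

def build_removed_urls_sitemap(urls, lastmod=None):
    """Build a standard XML sitemap listing URLs you want recrawled (e.g. recently removed pages)."""
    lastmod = lastmod or date.today().isoformat()
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(url)}</loc>\n    <lastmod>{lastmod}</lastmod>\n  </url>"
        for url in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    # Hypothetical input file containing one removed URL per line.
    with open("removed_urls.txt") as handle:
        removed = [line.strip() for line in handle if line.strip()]
    with open("sitemap-removed.xml", "w") as out:
        out.write(build_removed_urls_sitemap(removed))
    print(f"Wrote sitemap-removed.xml with {len(removed)} URLs; remember to delete it later.")
```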

 

Should your hreflang point to your m dot version of pages (assuming you have m dot)?

John says yes.
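A quick way to sanity check this is to fetch one of your m dot pages and confirm that every hreflang alternate it declares also points at an m dot host. The sketch below is a rough, standard-library example; the URL is a placeholder and the "m." host check is a simplification.

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit
from urllib.request import urlopen

class HreflangCollector(HTMLParser):
    """Collects <link rel="alternate" hreflang="..."> targets from a page."""

    def __init__(self):
        super().__init__()
        self.alternates = []  # list of (hreflang, href) tuples

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "alternate" and attrs.get("hreflang"):
            self.alternates.append((attrs["hreflang"], attrs.get("href", "")))

if __name__ == "__main__":
    # Placeholder mobile URL; replace with one of your m dot pages.
    mobile_url = "https://m.example.com/en/product"
    html = urlopen(mobile_url, timeout=10).read().decode("utf-8", errors="replace")
    collector = HreflangCollector()
    collector.feed(html)
    for lang, href in collector.alternates:
        host = urlsplit(href).netloc
        status = "OK" if host.startswith("m.") else "points at desktop host"
        print(f"{lang}: {href} ({status})")
```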

Seeing "uncommon downloads"?

Sometimes in Search Console you can get a message saying that there is a security problem for "harmful content". John mentioned that sometimes Google will label a site as having "uncommon downloads" if there are files that are unique each time a user downloads them. It doesn't always indicate a problem.

The new PageSpeed tool uses Lighthouse tests

John says that the new PageSpeed tool includes information from Lighthouse.


If you like stuff like this, you'll love my newsletter!

My team and I report every week on the latest Google algorithm updates, news, and SEO tips.


Full transcript

 

Question 0:30 - I have a question regarding the new PageSpeed tool. We’ve begun to check our pages with the tool daily. We’ve noticed that in the morning we’ll often get lazy loading warnings from the tool but in the evening this usually disappears. We were wondering if this is an issue with the tool itself or if it could be related to some of our digital assets.

 

Answer 1:09 - It’s hard to say what might actually be happening there. Normally, what we try to do with these tests is to keep them as stable as possible. Rather than this being an issue with the PageSpeed team changing their mind, I think it is likely an issue with your page being on the edge, where sometimes it’s enough to trigger the warning and sometimes not. So whether you should fix this depends on the issue itself.

 

I think the new PageSpeed tool is pretty neat because it includes the Lighthouse tests and it runs them for you, which saves you from wondering whether these tests would be the same on other devices as well. So I think that’s a pretty neat idea, to make a test which covers pretty much everything that is there. You can still run the Lighthouse tests natively in Chrome if you want to check where exactly these problems are happening, or if you want to tweak things on the page to see how that works with regards to the things that are flagged.
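Our note: if you want to track whether a warning like this lazy loading one comes and goes, one option is to query the PageSpeed Insights API (which returns the Lighthouse result) on a schedule and log the flagged audits. The sketch below assumes the v5 API and its lighthouseResult fields; double check the current API documentation before relying on specific field names.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_lighthouse_result(page_url, strategy="mobile", api_key=None):
    """Call the PageSpeed Insights API and return the embedded Lighthouse result."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    with urlopen(f"{PSI_ENDPOINT}?{urlencode(params)}", timeout=60) as response:
        data = json.load(response)
    return data.get("lighthouseResult", {})

if __name__ == "__main__":
    # Placeholder page; run this on a schedule to see whether a warning flaps.
    result = fetch_lighthouse_result("https://example.com/")
    performance = result.get("categories", {}).get("performance", {}).get("score")
    print("Performance score:", performance)
    for audit_id, audit in result.get("audits", {}).items():
        if audit.get("score") not in (None, 1):
            print(f"Flagged audit: {audit_id} ({audit.get('title')})")
```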

 

Question 3:14 - We have a lot of subject pages, because the old tag structure meant that if a word was used a lot of times, a new page would be automatically generated for that subject. If you have a lot of those pages, would it ruin the whole site?

 

Answer 3:43 - Sometimes they can be okay and sometimes they are low quality. The best thing to do is to look at these like normal pages and figure out whether or not you would like to have them shown in search, and maybe noindex them if not. Is there a way for you to determine whether you have some good tag pages and some not so good tag pages? That could be useful! I generally see these like any other pages on your website. So figure out whether you want to index them or not.

 

Question 4:35 - On the new Search Console, if I filter a query one by one, I will be able to see this query and all its metrics. I will also be able to see the pages that Google has shown for this query. What is the consideration made by Google to show these multiple pages as landing pages, and what can I do to optimize so Google only shows the one that we want?

 

Answer 5:10 - In Search Console there is a difference in how you look at those pages in the performance report. If you look by query then we show all individual pages that were shown for that query, while if you look by page then we only show the information for that specific page. For individual queries, multiple pages belonging to you could be shown, but we only pick the top ones. So if you compare those numbers by query and by page you’ll see that there is a difference. As for having Google show certain pages for certain queries, that is not something that you can easily do. There is no mechanism on our side that will facilitate that. It’s our algorithms that try and figure out what is most relevant for specific queries. If there is a page that you do not want to be shown, then you should think about how you can make the page you do want more relevant. This can be done with regards to the content on the page, so you could clarify where you want users to go for that specific query. Internal linking can help reinforce where exactly the important pages on your site are. Try and make it easier for users to get to those pages and Google will pick up on that as well.

 

Question 7:42 - Is there a way to input a group of queries to be shown in search analytics?

 

Answer 7:50 - Not that easily. What you can do is enter a part of a query and we’ll show everything that matches that query. For example, if you have one keyword and variations of that keyword, you can enter the one keyword in search analytics and it will show you the variations. What you can also do is import the table into Google Spreadsheets or Excel and then use a filter to refine things in the way you want them to be visible.
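Our note: if you prefer scripting over a spreadsheet, the same kind of filtering can be done after exporting the performance report to CSV. This pandas sketch assumes a hypothetical export file and a "Query" / "Clicks" column layout; adjust the names to match your actual export.

```python
import pandas as pd

# Hypothetical export from the Search Console performance report.
df = pd.read_csv("search_console_queries.csv")  # columns assumed: Query, Clicks, Impressions

# Keep only queries containing any of a group of keyword variations.
variations = ["blue widget", "widgets blue", "buy widget"]
pattern = "|".join(variations)
matching = df[df["Query"].str.contains(pattern, case=False, na=False)]

print(matching.sort_values("Clicks", ascending=False).head(20))
```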

 

Question 8:30 - I recently watched a November 13th video about the 200 ranking factors. Do you know where I can find a list of the 200 ranking factors?

 

Answer 8:43 - I don’t have a list of all of them. I think it’s also something that, in general, it’s not worthwhile to focus on too much. If you are looking at individual ranking factors then you easily lose sight of the bigger picture of where Google wants to go with regards to search, which is where people make high quality websites that are relevant for the user. Not where people are looking at “this meta tag here 3 times on a page… etc”, which isn’t something that actually makes the page relevant for a user. Focus more on the long term goal, because if you focus on that then you don’t have to worry about small fluctuations and the individual ranking factors.

 

Question 9:35 - I noticed that a page on my site has a new backlink on it, and it now bounces around on the second and third page (of search results?). I assume this is normal. If it is normal, how long does it take until Google knows whether it is a good link or whether I need to disavow it?

 

Answer 10:08 - Generally it’s a good sign if people are linking to your site! If this is a link that you did not place there yourself then you shouldn’t worry about it. This will happen more and more as time goes on. Most websites don’t ever have to use the disavow file.

 

Question 11:22 - In Search Console, 99% of our pages are excluded as “discovered, currently not indexed”. It has been like this for a couple of years even though we have links from some larger newspapers. What could be causing this, and what could I do to get these pages indexed? Is it possible that Google is not able to crawl the high number of pages?

 

Answer 11:48 - It sounds like we are seeing a high number of pages and our system is just not interested in indexing all of these, where it thinks “maybe it’s not worthwhile to crawl and index all of these”. That means we know about the page through a sitemap or internal linking but it’s not worth the effort, at least at the moment, to crawl and index this particular page. Especially if you are looking at a site with a high number of pages, this could be as simple as internal linking not being that fantastic, or the content on your site not being seen as that critical for search results. If it’s really 99% of the pages, you should maybe look at some of the technical things as well; in particular, make sure that you aren’t autogenerating URLs with different parameters, which can make Google think that there is duplicate content. Make sure to check your internal linking; a crawler tool can help you with this. If you are able to reduce pages and combine content, this can help you reduce duplication and improve content strength.

 

Question 16:16 - When you have a large number of pages that are discovered but not indexed is there a good way to determine whether they deserve to be indexed by Google or should be removed?

 

Answer 16:50 - There is no simple technical test that you can do. This depends on you looking at the pages with your own experience. Take a look at whether they all individually deserve to be indexed, or whether they can be combined and improved to strengthen the content and reduce the number of URLs that you have. Finding the middle ground is important here, but it is also something that can change over time.

 

Question 19:02 - After setting up the security certificate for the site, we noticed that we lost rankings for most of our keywords after a month. Today when I checked, I saw that we got back the rankings for some of our keywords. For two months we didn’t receive any organic traffic, though. What is the issue?

 

Answer 20:10 - I don’t know! One of the things that’s a bit tricky is that sometimes there are technical issues after a move to HTTPS. It could also be that there were some search quality ranking changes that took place at the same time. We made big updates to the core algorithm in the last couple of months. These changes often affect sites differently. That can make it hard to determine what exactly the issue was. Google has a lot of experience with HTTP to HTTPS migrations, so it is unlikely that any drop in traffic there is related to anything on Google’s end.

 

Question 23:19 - The Trust Project, an international consortium of news organizations, says that Google, Facebook, Bing, etc. use the indicators of the Trust Project and their associated tags as part of their algorithms or as part of their trust factors. So is being part of the Trust Project a ranking factor for news, or is it possible that it will be in the future?

 

Answer 23:48 - I don’t know. In general, I suspect that these memberships are not something that we would use as a direct ranking factor. I suspect that you have more of an indirect effect here, in that other people trust your content more because they realize that you’re actually taking part in these organizations, and these organizations are ones that don’t just accept anyone who wants to join. That’s something where you would probably see an indirect effect. I don’t know if we would use something like this as a direct ranking factor, as it can be a little bit tricky. As for whether it will be a ranking factor in the future, I think that kind of goes in the same direction. While being part of these organizations is generally good for the ecosystem going forward, I don’t think that they actually do anything to make the content more relevant.

 

Question 25:19 - 1 in 3 of my fetch and render requests in Search Console returns a “temporarily unreachable” error. I can’t see this effect in my logs. Are you aware of any issue with the fetch and render tool?

 

Answer 25:32 - I’m not aware of any issues that could be doing this. These are issues that we sometimes see from users in the forums. When you are rendering a page with the fetch and render tool, what we try to do is fetch as many of the resources as possible for the rendering. For that we have some fairly strict deadlines for how quickly we need to have the content back. That’s something where, if some of this content can’t come back in time, it can be seen as temporarily unreachable. This is unique to the tool and not something that is generally reflected in search. What you can do is look up some information on crawl budget (make your server faster), or you can think about how many embedded URLs are needed to render a page (we suggest reducing this to improve page performance). With things like multiple CSS files, you can see if you can compile them all into one CSS file, and you can do something similar with JavaScript or tracking systems.

 

Question 29:33 - Page load/download times have increased a lot and we’ve noticed the number of crawled pages has dropped significantly. Can this affect how often Google crawls and caches my pages and ultimately reduce ranking?

 

Answer 30:14 - It’s important to note that crawling does not mean ranking. If we can’t crawl a page now but we can do it tomorrow, that doesn’t mean that we see the quality of the page any differently. So just because we aren’t crawling a page as often as before doesn’t mean that we will rank it any differently. The one aspect where crawl rate may come into play is if you have new content on your website and we haven’t gotten around to crawling it yet. You do not need to artificially push the recrawling of content that you have. In regards to download time, I believe that’s in the stats in Search Console. If it does take us a long time to crawl a site, then we will generally crawl fewer URLs every day because we do not want to overload your server. It may be worthwhile to implement a CDN or caching to make it easier for users and Googlebot to get a cached version of those pages. If your website is changing infrequently, then you shouldn’t have to worry about being crawled frequently.

 

Question 33:00 - We have a client who seasonally adds and removes pages from their site. So when they take a page down, you receive a 404 error. When the holidays come around again, they will put the page back up. How does this affect ranking and indexing?

 

Answer 34:00 - I’m not so sure what the best thing to do is in this situation. This can be hard since our crawlers have to keep reevaluating whether or not this is valuable content. What you can do, though, is do lots of internal linking to that page when the holiday season is coming around. That way, when we do crawl your site, we can see that the page is extremely valuable since it has many internal links pointing to it. If you have a lot of seasonal content (Easter, Christmas, Thanksgiving, etc.) then it may make sense to have one page for seasonal content where you swap out the content depending on what season it is. That can make it easier to recognize that this page is important, especially since that one URL would be able to collect more links over a period of time. That can help us understand that your page is valuable over time even if you are changing the content every couple of months, just not if you are changing it every day.

 

Question 36:08 - We currently serve a country selector overlay on our website. The Googlebot render preview isn’t showing anything below the fold; it’s not able to load. Is Googlebot able to crawl the content?

 

Answer: It really depends on how you implement this. For this reason we normally suggest implementing a banner instead of an overlay. That way users can still see the content when they enter the page and they can select the locale of their choosing. You are able to do it more forcefully, but it’s very difficult to implement it so that it works well for search and for users.

 

Question 39:45 - Can the sudden rise of 404 codes (1000s) generated by CMS code cause rankings to drop? These pages were never meant to be there.

 

Answer 40:05 - If these pages were never meant to be there and there was never anything on them then that is perfectly fine! We see 404 errors all the time. It’s not a matter of us seeing 404 errors as a bad thing. In fact, seeing 404 errors is a sign that you are doing something right technically in that when a page doesn’t exist anymore that you give us an error code. That is the right thing to do! I’d look at the code itself though and try and sort out what is wrong. You should make sure that there isn’t anything on your end where you are accidentally leading people to a 404 page.

 

Question 41:55 - In E-A-T, is there any percentage that divides expertise, authority, and trust? Example: E = 50%, A = 25% and T = 25%.

 

Answer 42:10 - No. In general, E-A-T comes from the QRG. It’s very useful to look at these guidelines in terms of the direction that we want search to go. It’s important to realize that the QRG are just guidelines we give our quality raters. It’s not the case that we take the QRG and turn it into code. It is not a 1 to 1 ranking factor. Rather than working on the small details in the guide, you should look at the bigger picture. For example: “I know Google looks at expertise, authority, and trust so maybe I should work on those things because I am missing some things that signal that on my website”. These are things I would take as general feedback to improve a website, but not something that you would implement 1 to 1 to rank a website.

 

Question 43:50 - Is the service item type supported by Google for reviews and ratings? The review guidelines mention services, but the supported item types do not mention services.

 

Answer 44:03 - I would see the supported item types on the developer site as the authority for the kinds of objects that you can use this type of markup for. We work very hard to make sure that the documentation on the structured data side, especially in the developer information, is up to date. I realize we support a lot of different types of structured data and it can be hard to know which types apply to your case. In general we try to be as explicit as possible as to what we support.
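Our note: for context, review markup is just schema.org structured data, and the "supported item types" question is about what goes in itemReviewed. Below is a generic, hypothetical Review example in JSON-LD, built here as a Python dict; whether a particular itemReviewed type (such as a service) is eligible for review rich results is exactly what the developer documentation should be checked against.

```python
import json

# Generic schema.org Review markup; the business and rating values are made up.
review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {
        "@type": "LocalBusiness",
        "name": "Example Plumbing Co.",
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": "4",
        "bestRating": "5",
    },
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# This JSON would be embedded in the page inside a script tag
# with type="application/ld+json".
print(json.dumps(review, indent=2))
```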

 

Question 46:25 - My website has a lot of 404 errors through a plugin. Now whenever a user visits my website they get a 404 error.

 

Answer 46:40 - This is a scenario where it is useful to look at the sources of the 404 errors. So if we’ve been indexing pages and they are returning a 404 error then that’s probably a bad thing. In this situation I would try and find a pattern for those pages that are currently indexed and sending users a 404 error, then set up redirects so that you can connect users to your actual content.
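Our note: one hypothetical way to implement the pattern-based redirects John describes is a small 404 handler that 301s known old URL shapes to their current equivalents. This Flask sketch uses made-up patterns purely for illustration; the real mapping has to come from the 404 sources you find in Search Console or your logs.

```python
import re
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical patterns mapping old, now-404 URL shapes to current content.
REDIRECT_PATTERNS = [
    (re.compile(r"^/old-blog/(?P<slug>[\w-]+)/?$"), "/blog/{slug}"),
    (re.compile(r"^/product\.php$"), "/products/"),
]

@app.errorhandler(404)
def redirect_known_patterns(error):
    """If a missing URL matches a known old pattern, 301 it to the new location."""
    path = request.path
    for pattern, target in REDIRECT_PATTERNS:
        match = pattern.match(path)
        if match:
            return redirect(target.format(**match.groupdict()), code=301)
    # Otherwise keep returning a genuine 404.
    return "Page not found", 404

if __name__ == "__main__":
    app.run(debug=True)
```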

 

Question 47:12 - In Search Console, when checking the status of a resource for a site, it displays a message about a security problem for “harmful content”. At the same time, in the section listing the URLs of the pages with problems, the information about when it was last detected is missing. Checking the domain with the Transparency Report Safe Browsing page also shows nothing. After sending a request for a review, the security problem was removed manually, but after a few days it is displayed again in the account.

 

Answer 47:55 - It feels like this is something that we need to take a look at individually. If you have a thread in the forum, we can take a look at that. Looking at the screenshots, the problem that is displayed there is “uncommon downloads”. This is something that goes away after time, and if you submit a review this is something that we can take a look at as well. It may also be something that you can help influence on your side. If you have downloads on your website that change all the time, for example if they are unique every time a user downloads them, then that could be a reason for us to flag this as an uncommon download. This is because, since it is always going to be unique, it is not something that we can check on our end. This is not something that is always problematic. It’s, just like the name says, something that we haven’t seen a lot of and we haven’t been able to double check. So, you can help us out by requesting a review, or if you know that you are generating a lot of download files that are all unique, you can think about finding a way to have fewer downloads that are easier for our systems to double check before your users download them. I believe there is also a help centre article in regards to what can be done there.

 

Question 49:40 - We have an add-on for push notifications. This add-on has some resources that are downloaded from external domains. We don’t have any alerts for malicious content, but when we search for these domains, many of the public articles(?) mark them as spam. So we were wondering whether we need to stop using this add-on or whether we need a special domain on which to host the push notifications.

 

Answer 50:20 - How do you see that they are flagged as spam?

 

Reply 50:30 - We searched the domain and saw some review sites and security sites that mention that these domains have malicious content.

 

Answer 50:42 - I’d try to differentiate whether this is just considered spammy or actually malicious content. If it’s just seen as spammy, then generally that wouldn’t be too much of a problem for us. It may be a sign that it’s time to rethink the relationship with that company: are they really doing the right things for your users with the push notifications, or is the setup with the push notifications something that users don’t appreciate? If it is malicious, then that’s something where you probably want to be even more critical, since it’s likely that you’re including their JavaScript within your pages. So if their JavaScript does anything problematic, or their website gets hacked and their JavaScript gets altered, then essentially you’re putting this content on your site as well. So, in an extreme case, if their website is serving malware then your website would be serving malware. That is something that could lead us to flagging your website. So you have to assess whether it is malicious or actually just something that is seen as spammy, which would be more of a quality thing for you to work on.

 

Question 52:42 - We just removed 30,000 old pages; they were old, dating back to 2000. Do you know how long it will take for Google to see that they aren’t there anymore?

 

Answer 53:17 - Probably a long time! If there are old pages that haven’t been changed in a long time, then I assume we only crawl them every couple of months. So, if you’re looking at a lot of pages like that, I would assume a timeline of 3-6 months. If it’s something that you have to have removed quickly because they are very visible in search, people are clicking on them, and you don’t want them to go there, then you can work with the URL removal tool. There is a limit to the number of URLs that you can remove, but maybe there are things that you can submit, like subdirectories, that can remove these things quicker.

 

Follow up Question: My concern is that we have old content that may be seen as low quality, ruining the rankings of newer content with better answers.

 

Answer 54:34 - That is sometimes possible; that can happen. We do try to look at websites overall. So if you have a lot of low quality content, that could be something that makes us overlook high quality content. If you have older content that is okay quality, that alone is not a sign that the site is low quality. If you’re saying that you have new content and old content on the same topic, then maybe it makes more sense to redirect the old pages to the new ones. That way you can combine things instead of just deleting them.

 

Question 55:29 - We have separate URLs for desktop and mobile (example.com and m.example.com). The hreflang on m.example.com currently refers to the desktop versions of the pages. Should we change that to the m.example.com versions?

 

Answer 55:46 - Yes! Ideally you would have the m(dot) versions of the page linked by hreflang to other m(dot) versions. So it should always be the same type of page linking to the other page that is of the same type: mobile to mobile and desktop to desktop for hreflang. That is especially the case for mobile first indexing, where we have to have the hreflang link between the mobile versions. To some extent we can still figure it out if you have the canonicals, but ideally it would be mobile to mobile for hreflang. I believe we documented this a while ago, so I’d try to look at that in the developer documentation.

 

Question 58:00 - A new fashion brand webmaster becomes aware of adversarial SEO techniques. She becomes concerned that these will damage the brand’s sales and image. Somebody has tried to lower the brand’s ranking by posting fake reviews and spammy content on external discussion boards. Third party websites now have fake reviews about this brand’s product quality. How could a webmaster fight this technically? How does Google protect websites from attacks happening outside of their websites?

 

Answer 58:42 - For the most part, we are aware that this happens and it’s not something that I would spend too much time on. If you are seeing unnatural links appearing, then I would use the disavow tool. If you are seeing that things are being done in regards to your brand or your websites in ways that you think might even be illegal, then that’s something that you can take action on, on that basis. That is not something that Google would be able to help you with; we can’t make that type of judgement call. But if you see this taking place on third party websites, then you should get in touch with them and see what can be done to clean this up. I would generally try to get this cleared up, because that gives you time to focus on your core business, and you tend not to lose as much sleep over whether this has any effect. For the most part, I don’t think this is something where you would see any big issues.

 

Question 1:00:00 - How important are pages per session and average session duration in ranking a website? Our Google Analytics benchmarking values seem to be a bit lower than others in our industry. Should this be something that we try to improve and focus our efforts on?

 

Answer 1:00:16 - So we don’t use Analytics when it comes to search. So just because you see something in Analytics doesn’t mean that we would use it as a ranking factor. It is something that can be helpful to look at as a first step. However, when it comes to speed, we do use speed as a ranking factor, albeit a small one. Speed is something that has a fairly big influence on how users interact with your pages. Oftentimes there is a much stronger effect when it comes to speed in regards to user behaviour and which pages users look at, those kinds of things. So if you are seeing that your site is slower or worse off in regards to speed compared to your competitors, then that is something I’d focus on, even if it’s just for the users themselves. If users find a fast website, then they generally tend to stick around longer, click around more, and tend to convert more. That is something that larger e-commerce websites have written some interesting reports on. Oftentimes, there are changes in behaviour that can be measured down to the millisecond.

 

Question 1:02:10 - Is it good practice to send an XML sitemap for deleted URLs?

 

Answer 1:02:20 - You can do this, but I’d only do it temporarily. If you’ve just removed a bunch of your pages and you want Google to re-crawl them a little bit faster, the sitemap file can help us do that a little bit. In the long run, though, you should remove those sitemap files so that we don’t run into a situation where we are being told to index URLs that aren’t there. So temporarily is fine, but in the long run I would not do that.

 

Question 1:03:03 - Do brand search trends and social signals affect ranking?

 

Answer 1:03:06 - For the most part I don’t think we use social signals at all, and I don’t know what you mean when you say brand search trends. I think in general, if people are searching for your brand, that’s usually a good sign because it means that they explicitly want to go to your website. It also means that there is likely little competition if they are specifically looking for you in search. So that’s why it’s important to think about how you want to be seen in search in general, what brand you want to use online, what domain name you want to use, and having something that really makes it clear that when someone is searching for this they actually want to go to your website. There is no real competition for that kind of query.

 

Question 1:04:42 - When a user searches for your brand terms along with a product name, does that mean that Google’s algorithm will rank results for your brand better when only the product category is searched?

 

Answer 1:05:17 - I don’t know if that would generally make sense. I think that if people are searching for a kind of product category and your brand then we’ll try and bring that out in the search results. But that doesn’t mean that if people are just searching for the product category that it’s automatically the most relevant one. So if they’re searching with the brand name then obviously we’ll try to show you there but it doesn’t necessarily mean that your general rankings will be higher just because some people are searching for your brand name as well.

 

Question 1:06:00 - We have content on the website on similar topics. Because of the nature of the topic, we have it divided into two pages: one where we show a kind of listing, where it makes sense, and allow users to navigate from there to the specific topic, and another page speaking about the usefulness of that product. On one hand our domain page is not ranking that well, but we have all these content based pages that are performing pretty well. I was wondering, if we were able to merge these two pages, would it be better in terms of search visibility? Because if I do merge them, I fear that the page will become too huge and my users will not want to scroll through all of those things. How should I go about this?

 

Answer 1:07:05 - I would test it. I don’t think there is a clear yes or no answer to your question. I think it’s something that I’d generally try to test and see how users react to it. Think about how you could make the page not just this page plus another page, but something more organically combined that’s a bit shorter or works better for the users. Especially if you’re looking at a website that has a lot of these kinds of pages, try to take a representative sample of them and do some A/B testing to see if your hypothesis is correct. Maybe fewer pages would be better, or maybe these pages work a lot better when they are not combined. That’s something where you need to think about what your overall goal is for the page. Don’t just focus on things like page views or whether people are scrolling to the bottom. If you are trying to convince users to buy something, then track conversions and see if users are actually completing this action.

 

Question 1:10:07 - We had a site that ranked for a specific topic, then the owner rebranded. Now he potentially wants to rank for other topics. The site has many subdomains, so new subdomains are going to be created for these topics. An AdWords campaign was created for the old topic, so we know it has some indirect effect on organic. But I was wondering if this could have a negative effect on the organic positions for the old topic that we don’t want to get mentioned again.

 

Answer 1:10:55 - So there shouldn’t be any organic effect from running an ads campaign like this. That’s something where maybe you would see an indirect effect, in that people go to your content and then recommend it to others. But there wouldn’t be any effect from just running ads on a website. There is absolutely no connection between the ads side and the search side when it comes to ranking. In terms of the subdomains, sometimes it makes more sense to combine all of these into one larger primary website (unless it is language related).

 

Question 1:13:02 - Sometimes videos rank in the search results. Is there any way that a video can outrank a 5000 word blog post?

Answer 1:13:13 - Sure! It can! It doesn’t have to, but it can! There is no specific word count limit when it comes to ranking pages. It’s not the case, as far as I know, that any of our algorithms count the words on a page and then make a ranking decision based on that. Sometimes pages have very few words on them but are very useful. All of those options are possible. I wouldn’t blindly focus on the word count. I think word count can be used when you don’t really know the website and you’re trying to figure out which articles are really high quality and which are kind of thin. Then maybe the word count can give you roughly some information on those articles that you have there. In terms of ranking, though, word count is not something that we would use as a 1 to 1 ranking factor. When we are looking at things such as video vs. textual content, that is something that is always very tricky to balance in the ranking. It’s similar with regards to news, the news carousel, and images. All of these things are able to come together, and we have to look at a lot of information on a page that has lots of textual content vs. a page that has a lot of images or videos on it. So we have to think about what would be the most relevant result for users in those cases. And the mix of the types of search results tends to change over time.