Blog page won't get indexed
-
Hi Guys,
I've been asked to work on a website and noticed that its blog posts aren't getting indexed in Google. www.domain.com/blog itself does get indexed, but the individual blog posts don't. They have been online for over two months now.
I found this in the robots.txt file:
Allow: /
Disallow: /kitchenhandle/
Disallow: /blog/comments/
Disallow: /blog/author/
Disallow: /blog/homepage/feed/
I'm guessing that the last line causes the issue. Does anyone know whether that's the case, and why they would include it in the robots.txt?
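In case it helps, here's a quick local sanity check of those rules using Python's standard-library parser (a sketch: example.com and the post path are placeholders, and note that Python's parser applies rules first-match-wins, unlike Google's most-specific-match, so the Disallow lines are listed before the broad Allow):

```python
from urllib import robotparser

# The same rules as the site's robots.txt, reordered so the
# stdlib's first-match behaviour mirrors Google's interpretation.
rules = """\
User-agent: *
Disallow: /kitchenhandle/
Disallow: /blog/comments/
Disallow: /blog/author/
Disallow: /blog/homepage/feed/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The feed URL is blocked by the last Disallow line...
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/homepage/feed/"))  # False
# ...but an individual blog post under /blog/ is still crawlable
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/some-post/"))      # True
```

So that last line only blocks the feed path itself, not the blog posts.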
Cheers!
-
Thanks a lot!
-
Hi Dirk,
Good observation; I somehow missed the canonical part. So Google is indexing the canonical URLs here, which don't have /blog/ in them, and that's the problem. Have a look at the indexed page for this particular instance here: the non-/blog/ version is indexed, and it takes you to its /blog/ version, which carries the wrong canonical URL.
Solution: either remove the canonical URLs on these pages or make each one point to the current page itself (a self-referencing canonical). And yes, as Dirk rightly mentioned, link properly to the /blog/ versions from the blog page and from any other pages where you link these articles.
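For illustration, the fix in the page's head looks something like this (the URLs follow the pattern discussed in this thread, not the site's actual markup):

```html
<!-- On http://www.keukensduitsland.nl/blog/woontrends-2016/ -->

<!-- Current (wrong): points at the non-/blog/ URL, so Google indexes that one instead -->
<link rel="canonical" href="http://www.keukensduitsland.nl/woontrends-2016/">

<!-- Fixed: self-referencing canonical -->
<link rel="canonical" href="http://www.keukensduitsland.nl/blog/woontrends-2016/">
```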
-
This is definitely the issue. Fix that canonical and they'll be indexed.
-
To add to this, it's even worse: on the blog itself you are linking to the canonical version, not to the /blog/ version. So it would be practically impossible for Google to index the /blog/ content.
If you search woontrends 2016 site:www.keukensduitsland.nl you will notice that the canonical version is properly indexed (even with the strange JS redirect).
Dirk
-
It's not related to the robots.txt; you can easily check that in Webmaster Tools (Crawl > robots.txt Tester).
The first issue is the location of the link: if you hide a small link to the blog in the bottom-left corner of the page, Google is not going to attribute much importance to it.
The most important issue on your blog articles is the canonical. Example: http://www.keukensduitsland.nl/blog/woontrends-2016/ has the canonical URL http://www.keukensduitsland.nl/woontrends-2016/; however, that page redirects you with JavaScript back to the blog article.
Make the canonical self-referencing and do a proper redirect on the other pages (a server-side 301 rather than a JS redirect).
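For example, assuming the site runs on Apache (a sketch; the paths are illustrative), the JS redirect on the non-/blog/ page would be replaced by a server-side rule such as:

```apache
# Send the old non-/blog/ URL to the blog article with a permanent redirect,
# instead of serving a page that redirects via JavaScript.
Redirect 301 /woontrends-2016/ http://www.keukensduitsland.nl/blog/woontrends-2016/
```

A 301 passes the signal to crawlers immediately at the HTTP level, whereas a JS redirect requires Google to render the page first and may not consolidate the URLs at all.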
Dirk
-
Hi Happy SEO,
Well, the robots.txt looks fine here. Could you try to fetch one of the blog pages/posts as Google in Search Console and share a screenshot here?
Also, to cross-check the robots.txt (which does look fine), there is a robots.txt Tester in Search Console where you can enter any blog page/post to check whether bots can crawl it. Please share a screenshot of that as well.
On a separate note, the sitemap.xml link referenced in the robots.txt (http://www.keukensduitsland.nl/sitemap.xml) is broken; fix that as well.
-
Hi Nitin,
The URL is www.keukensduitsland.nl (/blog). The link to the blog page, called "Keukennieuws", is in the bottom-left corner.
-
Hi Happy SEO,
Could you please share the blog URL here? It sounds like an interesting issue, and I'd love to take a crack at helping you with it.