Index problems
-
“The website http://www.vaneyckshutters.com/nl/ does not show in Google's index (site:vaneyckshutters.com/nl/). This should be the homepage for the Netherlands. Previously, the page www.vaneyckshutters.com was redirected to /nl/. This page is now accessible with a canonical tag to http://www.vaneyckshutters.com/nl/ in the hope of getting /nl/ indexed. When we look at the SERPs for the keyword ‘shutters’, the page http://www.vaneyckshutters.com/ is shown in Google.nl at #32 and in Belgium at #3.
Problem & question: Why has /nl/ not been indexed properly, and why do we rank with http://www.vaneyckshutters.com for ‘shutters’ instead of the /nl/ page?”
-
Mmm... I wonder if Google decided to do so because of some external factor, for instance the backlinks your website has pointing to www.domain.com rather than www.domain.com/nl/, especially if those backlinks come from Dutch sites or target an NL audience.
That said, I still think your best solution would have been to put the entire NL version under the main root rather than in a /nl/ subfolder.
Example:
- www.domain.com <<< NL version
- www.domain.com/fr/ <<<< French version
- http://www.vaneyckshutters.com/kwaliteit-shutters/
- http://www.vaneyckshutters.com/fr/applications-de-nos-shutters/
and so on.
Obviously, right now this is not an easy solution, because it implies a sort of migration (tiny and internal, but a migration anyway, with everything that entails, such as 301s from the old to the new URLs), but it is possibly the truly valid one.
-
Anyone?
-
Hi Gianluca,
Thanks for your response and thoughts. We had a 301 redirect from the domain to /nl/, but that is where the problem began. We went through multiple situations.
Original situation: .com redirecting (301) to /nl/ => no indexation for /nl/
Solution 1: Delete the 301 redirect, add a canonical tag pointing to /nl/, and add hreflang tags
The ranking improved after implementing solution 1, but /nl/ has still not been indexed, even though the pages below /nl/ are indexed. Strange, isn't it? We see the same pattern at store.apple.com, which redirects to store.apple.com/us/ while /us/ has not been indexed. Our possible solution #2 was to set up the NL content on .com and the FR content on /fr/, but I'm afraid I would lose positions in Belgium (now ranking #3).
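For reference, "solution 1" as described above would amount to something like the following in the `<head>` of the root page. This is a hedged sketch based only on this thread's description; the exact live markup is an assumption:

```html
<!-- Sketch of the root page's <head> under "solution 1" as described above.
     The exact tags on the live site are an assumption based on this thread. -->
<head>
  <!-- The root canonicalizes to the Dutch homepage instead of 301-redirecting to it -->
  <link rel="canonical" href="http://www.vaneyckshutters.com/nl/" />
  <!-- hreflang annotations pointing Google at the language versions -->
  <link rel="alternate" hreflang="nl" href="http://www.vaneyckshutters.com/nl/" />
  <link rel="alternate" hreflang="fr" href="http://www.vaneyckshutters.com/fr/" />
</head>
```

Worth keeping in mind: rel="canonical" is a hint rather than a directive, so Google may keep indexing the root page regardless of this tag.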
-
Hi Gianluca
My apologies here. I can see where my answer fell short by failing to bring up that there is no hreflang tag for /nl/. My suggestion was to review the hreflang tag resources given, but I failed to mention that putting that hreflang tag in place would help the site.
I understand my error, and I appreciate you bringing it up and calling me on it. It won't happen again. Not taken personally at all; I appreciate you doing so, as it helps me be clearer moving forward.
Patrick
-
Hi "Happy SEO",
first of all, my question is:
Why do you really need to have the /nl/ subfolder shown in the index, if the root itself is in Dutch by default? Experience tells me that it is much easier to earn links to a domain than to a subfolder, so insisting on having the subfolder indexed somewhat adds difficulty in terms of link building.
That said, if you really want the /nl/ subfolder to be indexed instead of the root domain, why not simply 301 redirect the domain name to the /nl/ subfolder?
From the little I know about your site, that would be the most logical thing to do.
-
Mmm... Patrick, although you're sharing good resources and giving good general advice, you are not giving a proper answer... and as a moderator of the International SEO Q&A I have seen you do this frequently.
For instance, you are talking about the hreflang annotation toward the /fr/ homepage, but you are not pointing out that the hreflang annotation for the /nl/ version of the site and its URL is missing. Using hreflang would probably be more effective than using rel="canonical" in a case like this one.
So you are not really answering the problem "Happy SEO" has, which is why Google is indexing the root domain despite the canonicalization toward the /nl/ subfolder.
Please don't take it personally, but it is something I had to point out.
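To illustrate the point about the missing /nl/ annotation: a complete hreflang set lists every language version, including /nl/, and the same set must be repeated reciprocally on each of those pages. A sketch, where the x-default choice is an assumption rather than something stated in the thread:

```html
<!-- Identical block repeated in the <head> of the root, /nl/, and /fr/ pages;
     every page must list itself and all alternates for hreflang to be honored. -->
<link rel="alternate" hreflang="nl" href="http://www.vaneyckshutters.com/nl/" />
<link rel="alternate" hreflang="fr" href="http://www.vaneyckshutters.com/fr/" />
<link rel="alternate" hreflang="x-default" href="http://www.vaneyckshutters.com/" />
```

If any page omits the return annotation, Google may ignore the whole hreflang cluster, which can leave it free to pick the root over /nl/.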
-
Hi there
First, I would make sure that you review international SEO resources and that you have everything tagged properly. Your /fr/ site is being indexed, and your .com has a hreflang tag pointing to the /fr/ page.
I would review your hreflang tags, canonical tags, and your geo-targeting in Search Console for all of your sites. Make sure that all of that is spot on and resubmit your sitemaps to Google.
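As an alternative to `<link>` tags in the page head, the hreflang annotations can also be supplied through the XML sitemap when it is resubmitted. One possible shape, using the same assumed URLs from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Each <url> entry lists every language alternate, including itself -->
  <url>
    <loc>http://www.vaneyckshutters.com/nl/</loc>
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.vaneyckshutters.com/nl/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.vaneyckshutters.com/fr/"/>
  </url>
  <url>
    <loc>http://www.vaneyckshutters.com/fr/</loc>
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.vaneyckshutters.com/nl/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.vaneyckshutters.com/fr/"/>
  </url>
</urlset>
```

Use one method consistently (head tags or sitemap) rather than mixing both, to avoid conflicting signals.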
Hope this helps - good luck!
Related Questions
-
Why would Google not index all submitted pages?
On Google Search Console we see that many of our submitted pages weren't indexed. What could be the reasons?
Technical SEO | Leagoldberger | 130,030 submitted | 87,462 indexed
-
Google crawling but not indexing for no apparent reason
Client's site went secure about two months ago and chose the root domain as the rel canonical (so the site redirects to https://rootdomain.com, with no "www"). The client is seeing the site recognized and indexed by Google about every 3-5 days, and then dropped from the index until they request a "Fetch". They've been going through this annoying process for about 3 weeks now. Not sure if it's a server issue or a domain issue. They've done work to tighten up .htaccess (i.e., the redirects) and robots.txt. If you've encountered this issue and have a recommendation, or can point me to a tech site or person as a resource, please let me know. Google search engine results are respectable. One option would be to do nothing, but then would SERPs start to fall without requesting a new Fetch? Thanks in advance, Alan
Technical SEO | alankoen123
-
Homepage indexation issue
Hello all, I've been scratching my head about this one for a while now... Let me explain the situation. I'm working on a multi-lingual website. Visitors are redirected (301) when they visit the homepage to the correct domain.com/en/default.html, domain.com/nl/default.html, domain.com/fr/default.html or domain.com/de/default.html based on browser language. I have doubts about the impact this has on Google's ability to index the website, but that's a problem for another day. The problem I'm having right now is that domain.com/nl/default.html, domain.com/de/default.html and domain.com/fr/default.html are all indexed. When I search for each URL in Google I get the correct page at number one, so I'm pretty sure those are indexed correctly. When I search for domain.com/en/default.html, though, the homepage appears without the /en/default.html extension. Does this mean Google treats domain.com as the same page as domain.com/en/default.html despite the redirect that's in place? It would be great if someone could shed some light on this. Thanks in advance!
Technical SEO | buiserik
-
Pages Indexed Not Changing
I have several sites that I do SEO for that are having a common problem. I have submitted XML sitemaps to Google for each site, and as new pages are added to a site, they are added to its XML sitemap. To make sure new pages are being indexed, I check the number of pages that have been indexed vs. the number of pages submitted by the XML sitemap every week. For weeks now, the number of pages submitted has increased, but the number of pages actually indexed has not changed. I have done searches on Google for the new pages and they are always in the index, but the number of indexed pages is still not changing. My initial thought was that as new pages are added to the index, old ones are being dropped, but I can't find evidence of that, or understand why that would be the case. Any ideas on why this is happening? Or am I worrying about something I shouldn't even be concerned with, since new pages are being indexed?
Technical SEO | ang
-
AJAX and High Number Of URLS Indexed
I recently took over as the SEO for a large ecommerce site. Every month or so our Webmaster Tools account is hit with a warning for a high number of URLs. In each message they send there is a sample of problematic URLs. 98% of each sample is not an actual URL on our site but an AJAX request URL that users are making. This is a server-side request, so the URL does not change when users make narrowing selections for items like size, color, etc. Here is an example of what one of those looks like: Tire?0-1.IBehaviorListener.0-border-border_body-VehicleFilter-VehicleSelectPanel-VehicleAttrsForm-Makes. We have over 3 million indexed URLs according to Google because of this. We are not submitting these URLs in our sitemaps; Googlebot is making lots of AJAX selections according to our server data. I have used the URL parameter handling tool to target some of the parameters that were set to "let Google decide", so that no URLs with those parameters are indexed. I still need more time to see how effective that will be, but it does seem to have slowed the number of URLs being indexed. Other notes: 1. Overall traffic to the site has been steady and even increasing. 2. Googlebot crawls an average of 241,000 URLs each day according to our crawl stats. We are a large ecommerce site that sells parts, accessories and apparel in the powersports industry. 3. We are using the Wicket framework for our website. Thanks for your time.
Technical SEO | RMATVMC
-
Is it a problem to have an image + link in your menu
Hi, My menu has an image with links to some of the main pages on the site and text underneath it explaining what the banner is. Will it be beneficial or harmful to have the text hyperlinked to the same pages the images point to?
Technical SEO | theLotter
-
Best way to handle indexed pages you don't want indexed
We've had a lot of pages indexed by Google which we didn't want indexed. They relate to an AJAX category filter module that works fine for front-end customers, but under the bonnet Google has been following all of its links. I've put a rule in the robots.txt file to stop Google from following any dynamic pages (with a ?) and also any AJAX pages, but the pages are still indexed on Google. At the moment there are over 5,000 indexed pages which I don't want on there, and I'm worried this is causing issues with my rankings. Would a redirect rule work, or could someone offer any advice?
Technical SEO | | gavinhoman0 -
Removing some of the indexed pages from my website
I am planning to remove some of the webpages from my website, and these webpages are already indexed by search engines. Is there any way to inform search engines that these pages are no longer available?
Technical SEO | ArtiKalra