Local ccTLD site not showing up in local SERP
-
I have one website with two ccTLDs: one on .be, the other on .nl. Both are in Dutch and have pretty much the same content, just a different ccTLD.
The problem is that the .nl site is showing up in the SERPs on google.be, so I'm not seeing any keyword rankings for the .be site. I want only the .nl site to appear on google.nl and only the .be site on google.be.
I set up hreflang tags 2-3 weeks ago, and Search Console confirmed they've been implemented correctly. I've also fetched the site and requested a re-index.
Is there anything else I can do? Or how long do I have to wait until Google updates the SERPs?
-
Update: Still no improvement in the results, even after all the changes have been implemented. Anyone with other suggestions, perhaps?
-
Hi Jacob,
Don't point the canonical across both countries. Google will figure out the correct country targeting eventually; doing this will only hurt you.
You won't be penalized for duplicate content, but pages can be omitted from search results (per page) while Google hasn't yet figured out the country targeting, since it may think it's the same content. Be patient.
Another thing you can do is enable people to toggle between the .nl and .be site, and accept (for the time being) that you rank with the 'wrong' site.
I'm pretty sure the fix you mentioned below will help you!
- "The canonical URL doesn't point to the NL site or vice versa. It did point to a different URL, since we pull data from another system and use WordPress to generate the user-friendly URL, so the canonical still had a different URL. I've now made it exactly the same as the one shown in the address bar. I hope that helps in some way."
-
Hi Linda,
Thanks for the feedback.
- The hreflang format is correct, I just checked again: nl-nl and nl-be.
- The canonical URL doesn't point to the NL site or vice versa. It did point to a different URL, since we pull data from another system and use WordPress to generate the user-friendly URL, so the canonical still had a different URL. I've now made it exactly the same as the one shown in the address bar. I hope that helps in some way.
- Geotargeting was set correctly for each property in Search Console from the beginning.
- All backlinks are from .be domains except one with a high spam score; I've already requested its removal.
I'm also thinking about pointing the canonical URL of both the .nl and the .be site to the .be domain, since the content is the same. My current theory is that this is a duplicate-content situation and the .be site is somehow being filtered as the duplicate, which is why the .nl site shows up higher. Would this help? I mean, if I do this, would Google show the correct domain in the correct engine despite both having the same content?
-
Hi Antonio,
I actually meant that if you have duplicate content of some kind, your page example.be/xyz may have:
- a canonical to example.be/xyy
- hreflang pointing to example.be/xyz and example.nl/xyz, when it should also point to example.be/xyy
Did you also check if you used the right format for the hreflang (nl-be)?
And geotargeting is not set by default, so I'd recommend setting it anyway. It can't hurt.
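To make the canonical/hreflang agreement concrete, here's a minimal sketch of what the head of a page like example.be/xyz should contain once everything points to the same (canonical) URLs. The example.be/example.nl URLs are placeholders, not from the actual site:

```html
<!-- On https://www.example.be/xyz -->
<link rel="canonical" href="https://www.example.be/xyz" />
<!-- hreflang must reference the canonical URLs, and the .nl page must link back -->
<link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyz" />
<link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyz" />
<!-- optional fallback for users who match neither locale -->
<link rel="alternate" hreflang="x-default" href="https://www.example.be/xyz" />
```

The mirror of this (with the same two hreflang links and its own canonical) goes on https://www.example.nl/xyz; hreflang is ignored unless both pages confirm each other.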
-
Yes, the canonicals may be pointing to the .nl site, good point Linda. Jacob, you can check that in the same Screaming Frog crawl.
If the domain is .be, Google Search Console will automatically target it to Belgium.
-
- This item is OK.
- Yes, you can check it in Crawl Stats under the Crawl menu. Just to be sure, check the logs too. Is there any user-agent detection that could redirect Googlebot to another page? Check that with "Fetch as Google" under the same menu, or change the user agent in Screaming Frog and re-crawl your site to see whether there's a difference between the default SF user agent and Googlebot.
- Yes, you should use one method. If the tags in the head don't work (they should), try the sitemap annotations.
- The spam score should be addressed, but are the quality links from Belgium (or Belgium-oriented sites)?
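If the head tags keep getting overwritten by a plugin, the sitemap route looks roughly like this. The URLs are placeholders; note that each URL entry must list all of its alternates, including itself:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.be/xyz</loc>
    <xhtml:link rel="alternate" hreflang="nl-be" href="https://www.example.be/xyz"/>
    <xhtml:link rel="alternate" hreflang="nl-nl" href="https://www.example.nl/xyz"/>
  </url>
</urlset>
```

The .nl sitemap needs the mirrored entry for https://www.example.nl/xyz with the same two alternates, otherwise the annotations won't be honored.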
-
My experience tells me you might need to wait a bit longer.
Other problems you might have:
- Canonicals not pointing to the same URLs as the hreflangs.
- Geotargeting settings in Google Search Console.
- Belgium backlinks (from .be sites) - but this has been mentioned by Antonio.
-
Hey Jacob:
- Do you use Screaming Frog? It would be great to double-check whether any noindex directive is hurting your .be visibility (given that only a few of your pages are being indexed). The "site:" command is pretty useful on the fly, but I'd always recommend checking whether the URLs in your sitemap.xml are actually indexed. Wait 1-2 days after submitting your sitemap to see if anything changes.
- I assume you're running WordPress on an Apache server with PHP. In your File Manager (cPanel) or FTP client, go to the root directory (one level up from public_html); you should find a "logs" folder with a couple of compressed files. Unzip them, open them in Notepad or any text editor, and search for Googlebot to see its most recent requests.
- Yoast is a good plugin, I use it myself, but in this case it may be worth deactivating this feature of the plugin and finding another one that can handle hreflang, or doing it manually.
- Yes, maybe your .be link ecosystem is pointing to the .nl site. Check it with Open Site Explorer and, if that's the case, ask each site owner to change the domain they link to. If not, you should start building those links properly.
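The log check described above can be scripted instead of done by hand. A minimal sketch, assuming a standard Apache combined-format access log (the file path is hypothetical):

```python
import re

# Matches Apache combined-format log lines; we only need timestamp, path and user agent.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" '
    r'\d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_hits(lines):
    """Yield (timestamp, path) for every request whose user agent claims Googlebot."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("ua"):
            yield m.group("ts"), m.group("path")

# Hypothetical usage with an unzipped log file:
# with open("logs/access_log") as f:
#     for ts, path in googlebot_hits(f):
#         print(ts, path)
```

Run it against the logs of both the .be and .nl servers; if one site shows no recent Googlebot requests at all, that site simply hasn't been re-crawled yet. (Note this only checks the claimed user agent; anyone can spoof Googlebot.)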
-
Thanks for the reply Antonio.
- I checked robots.txt and it's not blocking anything. All pages are being indexed as well; when I use site:website.be I do see the results. It's just that the .nl site seems to overtake the .be results.
- Where could I find the log files from Googlebot?
- I'm using the Yoast SEO plugin for the XML sitemaps and there's no indication of the language there. I'll double-check.
- Concerning the backlinking, do you mean link building?
I've submitted my sitemap to Search Console and noticed that only a few of my pages have been indexed. But when I use "site:" I do get the pages.
-
In my experience this should take no more than two weeks once hreflang is set up properly (though it will depend on whether Googlebot crawls both sites frequently). The questions I would ask myself in this case:
- It's pretty basic, but sometimes we forget the basics: are you blocking the site with robots.txt? noindex tags? anything like that?
- Double-check that the hreflang is properly implemented.
- Is there any sign of Googlebot in the log files of both sites?
- Assuming you're using tags in the head for hreflang: have you tried forcing the implementation via sitemap.xml? https://support.google.com/webmasters/answer/189077?hl=en
- Have you tried getting backlinks to the .be domain from business partners in Belgium?
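Reciprocity is the part of hreflang that's easiest to break across two CMS installs: if the .nl page doesn't link back to the .be page, Google ignores the annotation. A small sanity check you could run over extracted annotations; the URLs and data structure here are illustrative, not from any real crawl:

```python
def reciprocity_errors(annotations):
    """annotations maps page URL -> {hreflang: target URL} as found on that page.

    Return human-readable problems: alternate targets whose page doesn't link back.
    """
    errors = []
    for page, alternates in annotations.items():
        for lang, target in alternates.items():
            if target == page:
                continue  # self-reference is fine (and recommended)
            back = annotations.get(target, {})
            if page not in back.values():
                errors.append(f"{page} -> {target} ({lang}) has no return link")
    return errors

pages = {
    "https://www.example.be/xyz": {
        "nl-be": "https://www.example.be/xyz",
        "nl-nl": "https://www.example.nl/xyz",
    },
    "https://www.example.nl/xyz": {
        "nl-nl": "https://www.example.nl/xyz",
        # missing nl-be alternate: the be -> nl link has no return link
    },
},
print(reciprocity_errors(pages[0]) if isinstance(pages, tuple) else reciprocity_errors(pages))
```

Feed it the hreflang links scraped from each URL pair (e.g. from a Screaming Frog export) and an empty list means both sides confirm each other.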