How to take out international URL from google US index/hreflang help
-
Hi Moz Community,
Weird/confusing question, so I'll try my best to explain. The company I work for also has an Australian retail website. When you do a site:ourbrand.com search, the second result that pops up is au.brand.com, which redirects to the actual brand.com.au website.
The Australian site owner removed this redirect per my boss's request, and now it leads to an unavailable webpage.
I'm confused as to the best approach. Is there a way to noindex the au.brand.com URL from US-based searches? My only problem is that the au.brand.com URL is ranking higher than all of the actual US-based sub-category pages in a site: search.
Is this an appropriate place for an hreflang tag? Let me know how I can help clarify the issue.
Thanks,
-Reed
Hi Sheena, sorry I didn't respond sooner; I wasn't receiving any notifications.
Thank you very much for your answer; it was extremely helpful and confirmed that what I was thinking was correct, with some added detail from you.
I didn't think removing the 301 was the best approach, but from my boss's standpoint, he sees it as them getting clicks that shouldn't be theirs. I just have to do my best to explain why keeping it is better for the long term.
The hreflang is in place, and I think the best approach will be to consolidate the international ccTLDs into the .com domain.
Thanks again, very helpful.
-Reed
I'm working on a very similar scenario, where .com.au pages are ranking in Google US and .com pages are ranking in Google AU (above .com.au pages).
We are moving forward with the hreflang attribute, since it was specifically introduced to help search engines serve the correct language or regional URL to searchers. In helping search engines index and serve the localized version of your content, hreflang also prevents duplicate content issues by telling Google that each potential "duplicate" is actually an alternate for users who require a different language or regional version. We see this as a short-term fix, as we plan to eventually consolidate the ccTLDs into the .com site.
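To make the mechanics concrete, here is a sketch of the reciprocal annotations that would go in the head of each pair of equivalent pages (the URLs are hypothetical placeholders, not our actual domains):

```html
<!-- The same block goes on BOTH https://www.brand.com/pebbles/
     and https://www.brand.com.au/pebbles/ -->
<link rel="alternate" hreflang="en-us" href="https://www.brand.com/pebbles/" />
<link rel="alternate" hreflang="en-au" href="https://www.brand.com.au/pebbles/" />
<link rel="alternate" hreflang="x-default" href="https://www.brand.com/pebbles/" />
```

Note that the annotations must be reciprocal: each page lists itself plus all of its alternates, otherwise Google may ignore the tags.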
Here are some international SEO / hreflang resources that might help:
- https://support.google.com/webmasters/answer/189077?hl=en
- http://moz.com/blog/hreflang-behaviour-insights
- http://moz.com/blog/the-international-seo-checklist
- Anything from Aleyda Solis &/or Gianluca Fiorelli
- http://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
- http://www.themediaflow.com/tool_hreflang.php
Also, since the AU subdomain pages were ranking well, I probably would have left the redirect in place rather than let it resolve to an error page. Then focus on mapping out the equivalents between the .com and .com.au sites. This is a very tedious project, but the last two links I shared above really help move things along once you have all the URL equivalents mapped out.
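Once the URL equivalents are mapped out, generating the reciprocal annotations can be scripted rather than written by hand. A minimal sketch (the mapping, URLs, and helper function are hypothetical placeholders, not an official tool):

```python
# Generate reciprocal hreflang <link> tags from a mapping of
# US (.com) pages to their AU (.com.au) equivalents.
# All URLs below are hypothetical placeholders.
url_map = {
    "https://www.brand.com/black-pebbles/": "https://www.brand.com.au/black-pebbles/",
    "https://www.brand.com/garden-gravel/": "https://www.brand.com.au/garden-gravel/",
}

def hreflang_tags(us_url: str, au_url: str) -> list[str]:
    """Return the <link> tags that belong on BOTH equivalent pages.

    Each page must reference itself and all of its alternates,
    so the identical block is emitted for the .com and .com.au
    versions of the page.
    """
    return [
        f'<link rel="alternate" hreflang="en-us" href="{us_url}" />',
        f'<link rel="alternate" hreflang="en-au" href="{au_url}" />',
        f'<link rel="alternate" hreflang="x-default" href="{us_url}" />',
    ]

for us_url, au_url in url_map.items():
    print(f"<!-- head block for {us_url} and its AU equivalent -->")
    print("\n".join(hreflang_tags(us_url, au_url)))
```

The same mapping can also drive an hreflang section in the XML sitemap instead of on-page tags, which is often easier to maintain on large sites.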
I hope this helps!