How to remove an international URL from Google's US index / hreflang help
-
Hi Moz Community,
Weird/confusing question so I'll try my best. The company I work for also has an Australian retail website. When you do a site:ourbrand.com search the second result that pops up is au.brand.com, which redirects to the actual brand.com.au website.
The Australian site owner removed this redirect per my bosses request and now it leads to a an unavailable webpage.
I'm confused as to best approach, is there a way to noindex the au.brand.com URL from US based searches? My only problem is that the au.brand.com URL is ranking higher than all of the actual US based sub-cat pages when using a site search.
Is this an appropriate place for an hreflang tag? Let me know how I can help clarify the issue.
Thanks,
-Reed
Hi Sheena, sorry I didn't respond sooner; I wasn't receiving any notifications.
Thank you very much for your answer, though. It was extremely helpful and confirmed that my thinking was on the right track.
I didn't think removing the 301 was the best approach, but from a boss's standpoint he sees it as them getting clicks that shouldn't be theirs; I just have to do my best to explain why keeping it is better in the long term.
The hreflang is in place, and I think the best approach would be to consolidate the international ccTLDs into the .com domain.
Thanks again, very helpful.
-Reed
I'm working on a very similar scenario, where .com.au pages are ranking in Google US and .com pages are ranking in Google AU (above .com.au pages).
We are moving forward with the hreflang attribute since it was specifically introduced to help search engines serve the correct language or regional URL to searchers. Besides helping search engines index and serve the localized version of your content, hreflang also prevents duplicate-content issues by telling Google that each potential "duplicate" is actually an alternate for users who require a different language or regional version. We see this as a short-term measure, as we plan to eventually consolidate the ccTLDs into the .com site.
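For anyone unfamiliar with what the annotations look like, here is a minimal sketch of a reciprocal hreflang pair. All URLs below are placeholders, not the actual sites in this thread, and the pair must appear on both the .com and the .com.au page for Google to honor it:

```html
<!-- In the <head> of http://www.example.com/widgets/ -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/widgets/" />

<!-- The SAME two tags also go in the <head> of
     http://www.example.com.au/widgets/ so the annotations are reciprocal. -->
```

If the annotations aren't reciprocal, Google may ignore them, which is why mapping out exact URL equivalents first matters so much.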
Here are some international SEO / hreflang resources that might help:
- https://support.google.com/webmasters/answer/189077?hl=en
- http://moz.com/blog/hreflang-behaviour-insights
- http://moz.com/blog/the-international-seo-checklist
- Anything from Aleyda Solis &/or Gianluca Fiorelli
- http://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
- http://www.themediaflow.com/tool_hreflang.php
Also, since the AU subdomain pages were ranking well, I probably would have left the redirect in place rather than letting it go to a 404. Then I'd focus on mapping out the equivalents between the .com and .com.au sites. This is a very tedious project, but the last two links I shared above really help move things along once you have all the URL equivalents mapped out.
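Once that URL mapping exists, the tag generation itself can be automated. This is just a sketch of the idea with placeholder URLs (not the real sites discussed here); the two hreflang generator tools linked above do essentially the same thing:

```python
# Sketch: generate reciprocal hreflang pairs from a .com <-> .com.au URL map.
# All URLs are hypothetical placeholders.

url_map = {
    "http://www.example.com/widgets/": "http://www.example.com.au/widgets/",
    "http://www.example.com/gadgets/": "http://www.example.com.au/gadgets/",
}

def hreflang_tags(us_url: str, au_url: str) -> list[str]:
    """Return the hreflang pair that BOTH pages should carry in their <head>."""
    return [
        f'<link rel="alternate" hreflang="en-us" href="{us_url}" />',
        f'<link rel="alternate" hreflang="en-au" href="{au_url}" />',
    ]

# Every mapped pair gets the same two tags on both the .com and .com.au page.
tags = {us: hreflang_tags(us, au) for us, au in url_map.items()}
for us_url, pair in tags.items():
    print(us_url)
    for tag in pair:
        print("  " + tag)
```

The tedious part is building `url_map` by hand; emitting the markup is trivial after that.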
I hope this helps!