How to remove an international URL from the Google US index / hreflang help
-
Hi Moz Community,
Weird/confusing question so I'll try my best. The company I work for also has an Australian retail website. When you do a site:ourbrand.com search the second result that pops up is au.brand.com, which redirects to the actual brand.com.au website.
The Australian site owner removed this redirect at my boss's request, and now it leads to an unavailable webpage.
I'm confused about the best approach. Is there a way to noindex the au.brand.com URL from US-based searches? My only problem is that the au.brand.com URL is ranking higher than all of the actual US-based sub-category pages in a site: search.
Is this an appropriate place for an hreflang tag? Let me know how I can help clarify the issue.
Thanks,
-Reed -
Hi Sheena, sorry I didn't respond sooner, I wasn't receiving any notifications.
Thank you very much for your answer though, this was extremely helpful and helped verify that what I was thinking was correct, with some added help from you.
I didn't think taking away the 301 was the best approach, but from my boss's standpoint, they're getting clicks that shouldn't be theirs; I just have to do my best to explain why keeping it is better for the long term.
The hreflang is in place, and I think the best approach would be to consolidate the international ccTLDs into the main .com domain.
Thanks again, very helpful.
-Reed -
I'm working on a very similar scenario, where .com.au pages are ranking in Google US and .com pages are ranking in Google AU (above .com.au pages).
We are moving forward with the hreflang attribute, since it was specifically introduced to help search engines serve the correct language or regional URL to searchers. In helping search engines index and serve the localized version of your content, hreflang also helps avoid duplicate-content issues by telling Google that each potential "duplicate" is actually an alternate for users who require a different language or regional version. We see this as a short-term fix, as we plan to eventually consolidate the ccTLDs into the .com site.
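As a rough illustration of what that markup looks like (using hypothetical example domains, not the actual sites from this thread), the reciprocal hreflang annotations in the head of each regional version might look like:

```html
<!-- On https://www.example.com/widgets/ (US/default version) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />
```

Note that the .com.au page must carry the same full set of tags pointing back; if the annotations aren't reciprocal, Google ignores them.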
Here are some international SEO / hreflang resources that might help:
- https://support.google.com/webmasters/answer/189077?hl=en
- http://moz.com/blog/hreflang-behaviour-insights
- http://moz.com/blog/the-international-seo-checklist
- Anything from Aleyda Solis &/or Gianluca Fiorelli
- http://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
- http://www.themediaflow.com/tool_hreflang.php
Also, since the AU subdomain pages were ranking well, I probably would have left the redirect in place rather than letting it go to a 404, and then focused on mapping out the equivalents between the .com and .com.au sites. This is a tedious project, but the last two links above really help move things along once you have all the URL equivalents mapped out.
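Once the URL equivalents are mapped out, generating the reciprocal tags for every page can be scripted rather than done by hand. A minimal sketch (the domains, paths, and helper name are hypothetical, not taken from the sites in this thread):

```python
# Map each page path to its regional URL equivalents; assumes a 1:1
# equivalence between the .com and .com.au versions of each page.
URL_MAP = {
    "/widgets/": {
        "en-us": "https://www.example.com/widgets/",
        "en-au": "https://www.example.com.au/widgets/",
    },
}

def hreflang_tags(path, default_locale="en-us"):
    """Return the <link rel="alternate"> tags that every regional
    version of this page should carry (the same full set on each)."""
    locales = URL_MAP[path]
    tags = [
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in sorted(locales.items())
    ]
    # x-default tells Google which URL to serve when no locale matches.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" '
        f'href="{locales[default_locale]}" />'
    )
    return tags

for tag in hreflang_tags("/widgets/"):
    print(tag)
```

The same mapping can feed an XML sitemap with xhtml:link entries instead, which keeps the annotations out of the page templates entirely.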
I hope this helps!