How to remove an international URL from Google's US index / hreflang help
-
Hi Moz Community,
Weird/confusing question, so I'll try my best. The company I work for also has an Australian retail website. When you do a site:ourbrand.com search, the second result that pops up is au.brand.com, which redirects to the actual brand.com.au website.
The Australian site owner removed this redirect per my boss's request, and now it leads to an unavailable webpage.
I'm confused as to the best approach: is there a way to noindex the au.brand.com URL from US-based searches? My only problem is that the au.brand.com URL is ranking higher than all of the actual US-based sub-category pages when using a site: search.
Is this an appropriate place for an hreflang tag? Let me know how I can help clarify the issue.
Thanks,
-Reed -
Hi Sheena, sorry I didn't respond sooner; I wasn't receiving any notifications.
Thank you very much for your answer; it was extremely helpful and confirmed that my thinking was on the right track.
I didn't think taking away the 301 was the best approach, but from my boss's standpoint, they're getting clicks that shouldn't be theirs; I just have to do my best to explain why leaving it in place is better for the long term.
The hreflang markup is in place, and I think the best long-term approach is to consolidate the international ccTLDs into the .com domain.
Thanks again, very helpful.
-Reed -
I'm working on a very similar scenario, where .com.au pages are ranking in Google US and .com pages are ranking in Google AU (above .com.au pages).
We are moving forward with the hreflang attribute, since it was specifically introduced to help search engines serve the correct language or regional URL to searchers. In helping search engines index and serve the localized version of your content, hreflang also helps head off duplicate content issues by telling Google that each potential "duplicate" is actually an alternative for users who need a different language or regional version. We see this as a short-term measure, as we plan to eventually consolidate the ccTLDs into the .com site.
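For reference, a minimal sketch of what reciprocal hreflang annotations could look like across a .com / .com.au page pair (the domain and path below are placeholder stand-ins, not the poster's actual URLs):

```html
<!-- On https://www.brand.com/widgets/ (the US page): -->
<link rel="alternate" hreflang="en-us" href="https://www.brand.com/widgets/" />
<link rel="alternate" hreflang="en-au" href="https://www.brand.com.au/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.brand.com/widgets/" />

<!-- The .com.au equivalent page must carry the same set of tags so the
     annotations are reciprocal; one-way hreflang is ignored by Google. -->
```

Each page lists itself plus every alternate, and the annotations must point back at each other for Google to honor them.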
Here are some international SEO / hreflang resources that might help:
- https://support.google.com/webmasters/answer/189077?hl=en
- http://moz.com/blog/hreflang-behaviour-insights
- http://moz.com/blog/the-international-seo-checklist
- Anything from Aleyda Solis and/or Gianluca Fiorelli
- http://moz.com/blog/using-the-correct-hreflang-tag-a-new-generator-tool
- http://www.themediaflow.com/tool_hreflang.php
Also, since the AU subdomain pages were ranking well, I probably would have left the redirect in place rather than letting it go to a 404. Then focus on mapping out the equivalents between the .com and .com.au sites. This is a very tedious project, but the last two links I shared above really help move things along once you have all the URL equivalents mapped out.
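If the redirect does go back in, a minimal sketch of what the mapping could look like, assuming an Apache vhost serves au.brand.com (the paths are hypothetical):

```apache
# 301 each legacy subdomain URL to its mapped .com.au equivalent so both
# the link equity and the click land on a live page.
Redirect 301 /sale-items https://www.brand.com.au/sale-items
Redirect 301 /widgets    https://www.brand.com.au/widgets

# mod_alias applies rules in order and the first match wins, so this
# catch-all sends anything unmapped to the homepage instead of a 404.
RedirectMatch 301 ^/ https://www.brand.com.au/
```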
I hope this helps!
Related Questions
-
Disallow: /jobs/? Is this stopping search engines from indexing job posts?
Hi,
I was wondering what this would be used for, as it's in the robots.txt of a recruitment agency website that posts jobs. Should it be removed?
Disallow: /jobs/?
Disallow: /jobs/page/*/
Thanks in advance.
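For reference, a sketch of what those two patterns actually match in practice (the example URLs are hypothetical):

```text
User-agent: *
# Robots.txt rules are prefix matches, and "?" is a literal character,
# so this blocks only the parameterised listing URLs:
#   https://example.com/jobs/?location=london    -> blocked
#   https://example.com/jobs/graphic-designer    -> still crawlable
Disallow: /jobs/?
# "*" is a wildcard for major crawlers, so this blocks paginated
# listing URLs such as https://example.com/jobs/page/2/
Disallow: /jobs/page/*/
```

So individual job posts should still be crawlable; only the filtered and paginated listing URLs are blocked.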
James
-
Google not Indexing images on CDN.
My URL is: http://bit.ly/1H2TArH. We have set up a CDN on our own domain: http://bit.ly/292GkZC. We have an image sitemap: http://bit.ly/29ca5s3. The image sitemap uses the CDN URLs. We verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: http://bit.ly/29eNSXv. We used to have a disallow to /thumb/, which had a 301 redirect to our CDN, but we removed both the disallow in the robots.txt and the 301. Yet GWT still reports that none of our images on the CDN are indexed. The above screenshot is from the GWT of our main domain; the GWT of the CDN subdomain just shows 0. We did not submit a sitemap to the verified subdomain property because we already have a sitemap submitted to the property on the main domain name. When searching for images indexed from our CDN, nothing comes up: http://bit.ly/293ZbC1. While checking the GWT of the CDN subdomain, I have been getting crawl errors, mainly 500-level errors, though not that many in comparison to the number of images and the traffic we get on our website. Google is crawling, but it seems like it just doesn't index the pictures!? Can anyone help? I have followed all the information I was able to find on the web, but our images on the CDN still can't seem to get indexed.
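For anyone untangling a similar setup, here is a minimal sketch of an image sitemap entry where the page lives on the main domain but the file sits on a CDN subdomain (example.com and the file names are hypothetical stand-ins):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page URL stays on the main domain... -->
    <loc>https://www.example.com/products/blue-widget</loc>
    <!-- ...while the image URL points at the CDN host; Google allows
         this when both hosts are verified in GWT. -->
    <image:image>
      <image:loc>https://cdn.example.com/images/blue-widget-large.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```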
-
Old/wrong meta-titles in index
Hi, We have problems with old meta titles in the index of google.nl. If you look, for example, at this wine: https://www.wijnvoordeel.nl/Italie/Just-Hugo::5460.html, the meta title is: **Just Hugo | Heerlijke Hugo | Het zomerdrankje van 2014 | Wijnvoordeel**. If you look at the results in Google (https://www.google.nl/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#safe=active&q=just hugo), the meta title shown is: Just Hugo - Wijnvoordeel (this is an old, automatically generated meta title). I already added the code "", but I don't see any progress. Does anybody know what could be the problem? Thanks for the help! Douwe Veldstra
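For clarity, the on-page tag in question would be (title text taken from the question above):

```html
<!-- The current title element on the product page. If Google keeps
     showing the older "Just Hugo - Wijnvoordeel" title, it is worth
     checking that no template or plugin outputs a second, conflicting
     <title> element. -->
<title>Just Hugo | Heerlijke Hugo | Het zomerdrankje van 2014 | Wijnvoordeel</title>
```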
-
Google Indexed my Site, then De-indexed it a Week Later
Hi there, I'm working on getting a large e-commerce website indexed and I am having a lot of trouble.
The site is www.consumerbase.com. We have about 130,000 pages and only 25,000 are getting indexed. I use multiple sitemaps so I can tell which product pages are indexed, and we need our "Mailing List" pages the most - http://www.consumerbase.com/mailing-lists/cigar-smoking-enthusiasts-mailing-list.html. I submitted a sitemap a few weeks ago for a particular type of product page, and about 40k of the 43k pages were indexed - GREAT! A week ago, Google de-indexed almost all of those new pages. Check out this image; it kind of boggles my mind and makes me sad: http://screencast.com/t/GivYGYRrOV. While these pages were indexed, we immediately received a ton of traffic to them, making me think Google liked them. I think our breadcrumbs, site structure, and "customers who viewed this product also viewed" links would make the site extremely crawlable. What gives?
Does it come down to our site not having enough Domain Authority?
My client really needs an answer about how we are going to get these pages indexed.
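One note on the multiple-sitemaps approach mentioned above: splitting the sitemap index by page type is what makes per-type indexation visible in GWT. A minimal sketch (the file names are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Each child sitemap holds one page type; submitting them separately
     lets GWT report submitted vs. indexed counts per type, which is how
     a figure like 40k of 43k can be read off. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.consumerbase.com/sitemap-mailing-lists.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.consumerbase.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>
```

-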
Google Not Indexing XML Sitemap Images
Hi Mozzers, We are having an issue with our XML sitemap images not being indexed. The site has over 39,000 pages and 17,500 images submitted in GWT. If you take a look at the attached screenshot, 'GWT Images - Not Indexed', you can see that the majority of the pages are being indexed, but none of the images are. The first thing you should know about the images is that they are hosted on a content delivery network (CDN), rather than on the site itself. However, Google's advice suggests hosting on a CDN is fine - see the second screenshot, 'Google CDN Advice'. That advice says to either (i) ensure the hosting site is verified in GWT or (ii) submit in robots.txt. As we can't verify the hosting site in GWT, we opted to submit via robots.txt. There are 3 sitemap indexes: 1) http://www.greenplantswap.co.uk/sitemap_index.xml, 2) http://www.greenplantswap.co.uk/sitemap/plant_genera/listings.xml and 3) http://www.greenplantswap.co.uk/sitemap/plant_genera/plants.xml. Each sitemap index is split up into often hundreds or thousands of smaller XML sitemaps. This is necessary due to the size of the site and how we have decided to pull URLs in; done another way, some of the sitemaps would be massive and take upwards of a minute to load. To give you an idea of what is being submitted to Google in one of the sitemaps, please see view-source:http://www.greenplantswap.co.uk/sitemap/plant_genera/4/listings.xml?page=1. Originally, the images were SSL, so we reverted to non-SSL URLs as that was an easy change, but over a week later that seems to have had no impact. The image URLs are ugly... but should this prevent them from being indexed? The strange thing is that a very small number of images have been indexed - see http://goo.gl/P8GMn. I don't know if this is an anomaly or whether it suggests there is no issue with how the images have been set up, and thus there may be another issue. Sorry for the long message, but I would be extremely grateful for any insight into this. I have tried to offer as much information as I can; please let me know if this is not enough. Thank you for taking the time to read and help. Regards, Mark
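On option (ii) above, the robots.txt route boils down to a Sitemap directive in the main domain's robots.txt (the sitemap URL below is the one from the question; the rest of the file is illustrative):

```text
# Referencing the sitemap index from robots.txt lets Google discover it
# without a separate GWT submission; the image URLs inside can point at
# the CDN host even though this file lives on the main domain.
User-agent: *
Disallow:

Sitemap: http://www.greenplantswap.co.uk/sitemap_index.xml
```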
-
Disallowed Pages Still Showing Up in Google Index. What do we do?
We recently disallowed a wide variety of pages for www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically, we don't want to spread our link juice around to all these pages that are never going to rank; we want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed, when really it should only be showing ~5,000. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" to our robots.txt properly, so this should not be happening; Google should be showing 0 results. Any ideas on how we get Google to pay attention and re-index our site properly?
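Worth noting for anyone hitting the same wall: a robots.txt Disallow only stops crawling; it does not remove URLs that are already indexed, and blocked URLs can keep showing in site: results indefinitely. A common fix (sketched here as a general pattern, not as what Udemy actually did) is to temporarily allow crawling of those paths and serve a robots meta tag instead, so Google can see the removal instruction:

```html
<!-- Served on each /tag or /lectures page while the path is crawlable;
     once the pages drop out of the index, the Disallow can go back in. -->
<meta name="robots" content="noindex, follow">
```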
-
Is Google taking longer to rank new sites?
We run a lot of "niche blogs" and websites focused on fairly non-competitive keywords. At the start of the year, we could put up websites and achieve almost instant rankings. Recently, however, it seems to be taking a lot longer for these sites to rank, and it also seems to be taking longer for Google to index links. Is this a recent change in Google to protect against spam and help filter out lower-quality sites? Has anyone else noticed this, or is it just me?
-
My website keywords have been almost completely taken out of Google's index since 04/26/11 and I cannot determine why. Anyone know?
I had 12 to 15 first-page Google rankings in the iPhone, iPad, and app review vertical. As of 04/26/11 I have lost all rankings, and traffic has gone from 1,000-1,200 a day to 150-350 a day. I was using a plugin for automatic press releases, but I have removed it and deleted the URLs. I have also changed themes and hosting over the last 3 weeks. I have been trying to get SEO help, but cannot seem to get anyone to help me. Thank you, Mike