How can I penalise my own site in an international search?
-
Perhaps penalise isn't the right word, but we have two ecommerce sites.
One at .com and one at .com.au.
For the com.au site we would like only that site to appear for our brand name search in google.com.au.
For the .com site we would like only that site to appear for our brand name search in google.com.
I've set the geographic target for each site in Google Webmaster Tools and published the Australian and English addresses on the respective sites.
What I'm concerned about is people on Google.com.au searching our brand and clicking through to the .com site.
Is there anything I can do to lower the ranking of my .com site in Google.com.au?
-
One of the example scenarios Google gives is:
Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland.
Tough call, you might have to do some research to see if this solution will help in your particular scenario.
-
They aren't identical: they have different designs, different text, almost everything.
They are similar in that they are both book stores.
The .com.au site uses Australian wording and spelling; the .com uses English spelling and wording.
Do we need to specify hreflang="en-au" if they are different sites?
-
Are the sites identical but just hosted on different domains to target different regions?
Is there any variation in the English used on each site, for example, do you have Australian English spelling on the .com.au and US (or other) English on the .com?
If yes, you might want to have a look into the rel="alternate" hreflang="x" meta tags.
Check out: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
In particular, see the "Example configuration: rel="alternate" hreflang="x" in action" section.
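For reference, the annotations described on that page are plain link elements in the head of each page. A minimal sketch for two storefronts like these (the URLs below are illustrative placeholders, not the actual sites) might look like:

```html
<!-- Placed in the <head> of BOTH the .com page and the matching .com.au page. -->
<!-- Each page points at itself and at its alternate; one-way references are ignored. -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
```

Note that every page in the pair carries the full set of annotations; the tags must be reciprocal for Google to honour them.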
-
Thanks Mat, that definitely sounds wise.
Penalise was definitely the wrong word. What I really meant was: what other signals can we send to Google to say that this is the .com.au site and we want it to appear above the .com?
-
I'd be ever so careful about doing anything to deliberately lower your ranking. It just sounds like an approach that could go horribly wrong.
Your best bet might be to live with the fact that both will appear (or better still, enjoy and encourage it), but use the sites to achieve the end goal of getting users onto the correct site.
The usual way to do this is to check the user's IP address against a GeoIP database. I've used both the paid and free versions of the database available at maxmind.com for this. That will let you identify users who are in Australia and direct them towards the .com.au site.
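As a minimal sketch of that decision logic: the country code itself would come from a MaxMind lookup of the visitor's IP (server-side), and the function and domain names here are illustrative placeholders.

```javascript
// Map a two-letter ISO country code (as returned by a MaxMind GeoIP
// lookup of the visitor's IP address) to the storefront we want to
// suggest. The domain names are placeholders for the two real sites.
function siteForCountry(countryCode) {
  if (countryCode === 'AU') {
    // Australian visitors get pointed at the .com.au store.
    return 'http://www.example.com.au/';
  }
  // Everyone else stays on (or is pointed at) the .com store.
  return 'http://www.example.com/';
}
```

The lookup itself would happen once per session, server-side, before deciding whether to show the prompt described below.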
How you direct them is important. You could just automatically redirect those users to the other site. Some people will say this can look like cloaking and cause issues, but I don't believe that alone will. However, it is often better to intercept those users with a message along the lines of "It looks like you are connecting from Australia - would you like to view our dedicated Australian website?", then list the benefits and offer a choice there.
If you do that, it would be good to set a custom variable in analytics to record when that message has been shown. That will let you measure how many people follow the suggestion.
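A sketch of that tracking step, using the ga.js `_setCustomVar` command queue that Google Analytics exposed at the time: the slot number (1), variable name, and session-level scope (2) are all illustrative choices, as is the helper function name.

```javascript
// Record that the "would you like the Australian site?" prompt was
// shown, via a Google Analytics (ga.js) session-scoped custom variable.
// Slot 1, the name 'GeoPrompt', and scope 2 (session) are example values.
function trackGeoPromptShown(gaq) {
  var command = ['_setCustomVar', 1, 'GeoPrompt', 'shown', 2];
  gaq.push(command); // ga.js processes commands pushed onto its queue
  return command;
}

// In the page, ga.js exposes the command queue as the global _gaq array:
// trackGeoPromptShown(window._gaq);
```

You would call this at the moment the prompt is rendered, then segment visits in Analytics by that variable to see how many prompted users actually switch sites.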
Once you are happy it is working, you will probably end up encouraging both domains to appear, as dominating the SERP for your brand is always useful.