How can I penalise my own site in an international search?
-
Perhaps penalise isn't the right word, but we have two ecommerce sites.
One at .com and one at .com.au.
For the com.au site we would like only that site to appear for our brand name search in google.com.au.
For the .com site we would like only that site to appear for our brand name search in google.com.
I've targeted each site in the respective country in Google Webmaster Tools and published the Australian and English address on the respective site.
What I'm concerned about is people on Google.com.au searching our brand and clicking through to the .com site.
Is there anything I can do to lower the ranking of my .com site in Google.com.au?
-
One of the example scenarios Google gives is:
Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland.
Tough call; you might have to do some research to see whether this solution will help in your particular scenario.
-
They aren't identical; they have different designs, different text, almost everything.
They are similar in the sense that they are both book stores.
The .com.au has Australian wording and spelling; the .com has UK English wording and spelling.
Do we need to specify hreflang="en-au" if they are different sites?
-
Are the sites identical but just hosted on different domains to target different regions?
Is there any variation in the English used on each site, for example, do you have Australian English spelling on the .com.au and US (or other) English on the .com?
If yes, you might want to have a look into the rel="alternate" hreflang="x" annotations.
Check out: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
Especially the "Example configuration: rel="alternate" hreflang="x" in action" section.
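A minimal sketch of what those annotations could look like for this case, assuming the .com targets a general English audience and the .com.au targets Australia (the domains below are placeholders, not the asker's actual sites). The same pair of tags would go in the head of both pages:

```html
<!-- In the <head> of the .com page AND the equivalent .com.au page -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<!-- Optional: a fallback for searchers outside both regions -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note the annotations must be reciprocal: each page has to reference itself as well as its alternate, or Google may ignore them.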
-
Thanks Mat, that definitely sounds wise.
Penalise was the wrong word; what I really meant was: what other signals can we send Google to say that this is the .com.au site and we want it to appear above the .com?
-
I'd be ever so careful about doing anything to deliberately lower your ranking. It sounds like an approach that could go horribly wrong.
Your best bet might be to live with the fact that both will appear (or, better still, enjoy and encourage it), but use the sites to achieve the end goal of getting users onto the correct site.
The usual way to do this is to check the IP address of the user against a GeoIP database. I've used both the paid and free versions of the database available at maxmind.com for this. That will allow you to identify users who are in Australia and direct them toward the .com.au site.
How you direct them is important. You could just automatically redirect those users to the other site. Some people will say this can look like cloaking and cause issues, but I don't believe that alone will cause problems. However, it is often better to intercept those users with a message along the lines of "It looks like you are connecting from Australia. Would you like to view our dedicated Australian website?", then list the benefits and offer a choice.
If you do that, it would be good to set a custom variable in analytics to record when that message has been shown. That would allow you to measure how many people are following the suggestion.
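The decision logic behind that interstitial can be sketched roughly as below. This is a hypothetical illustration, not production code: the function names are made up, and the GeoIP lookup is stubbed out where a real site would query something like MaxMind's database instead.

```python
def country_for_ip(ip):
    """Stub for a GeoIP lookup.

    In production this would query a GeoIP database (e.g. MaxMind's)
    rather than this hard-coded table, which exists only for illustration.
    """
    stub = {"1.120.0.1": "AU", "8.8.8.8": "US"}
    return stub.get(ip, "US")


def au_prompt_needed(ip, current_domain, dismissed_cookie):
    """Show the 'view our Australian site?' message only when a visitor
    from Australia lands on the .com site and hasn't already dismissed it."""
    if dismissed_cookie:
        return False
    on_com_site = current_domain.endswith(".com")  # .com.au fails this check
    return on_com_site and country_for_ip(ip) == "AU"


print(au_prompt_needed("1.120.0.1", "example.com", dismissed_cookie=False))  # True
print(au_prompt_needed("8.8.8.8", "example.com", dismissed_cookie=False))    # False
print(au_prompt_needed("1.120.0.1", "example.com.au", dismissed_cookie=False))  # False
```

Storing the dismissal in a cookie and firing the analytics variable at the same point as the prompt keeps the message from nagging returning visitors while still letting you measure take-up.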
Once you are happy it is working, you will probably end up encouraging both domains to appear, as dominating the SERP for your brand is always useful.