When a client clones their UK site copy for a US version...
-
Buongiorno from a cloudy 16°C Wetherby, UK,
A client has cloned their UK site's copy for a US version. What they've now got is a US site and a UK site with exactly the same copy; the only difference is the domain suffix.
Am I right in saying this will cause problems when, for example, a searcher enters a phrase and both sites appear in the SERPs? Is a solution to block the US site from appearing in the UK (is this even possible?).
Yes, I know the true fix is to change the copy, but we are dealing with clients here.
Grazie,
David
-
Hey, thanks Oliver, much appreciated (and Ron too).
-
Hi David,
- Are there different addresses/telephone numbers on the websites, one UK and one American?
- Does one website use UK £ for currency and the other US $?
- In Webmaster Tools, is one set as a US-targeted website and the other UK?
- Does one website have American-based links pointing to it and the other UK-based links?
We looked into something similar for a client who wanted the same content for 3 different countries, and we came across this blog post: http://googlewebmastercentral.blogspot.co.uk/2010/03/working-with-multi-regional-websites.html
After reading it, we felt that as long as the points above were different, and the UK website was hosted on a UK IP address and the American one on a US IP address, it would be OK.
I'd be very interested to hear from other people about their experiences with this, in case we've got it wrong.
Hope that this helps.
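One concrete signal worth adding on top of the points above (treat this as a suggestion, not something from the blog post): hreflang annotations, which tell Google which version of a page targets which region. A minimal sketch, assuming both sites share the same page structure; the `example.co.uk` / `example.com` URLs are placeholders for the client's actual domains:

```html
<!-- In the <head> of the UK page, e.g. https://www.example.co.uk/widgets -->
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/widgets" />
<link rel="alternate" hreflang="en-US" href="https://www.example.com/widgets" />

<!-- The US page at https://www.example.com/widgets carries the same two tags,
     so the annotations are reciprocal (both pages reference each other) -->
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/widgets" />
<link rel="alternate" hreflang="en-US" href="https://www.example.com/widgets" />
```

Each page must annotate every alternate, including itself, or Google may ignore the tags. This doesn't "block" the US site in the UK, but it tells Google which version to prefer for each country.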
-
Yes, you are right. Panda specifically penalizes sites with duplicate content. As a rule of thumb, you need at least 65% original content on each page to avoid this penalty. I would suggest localizing the copy on each site with references to its own region, integrating local terminology, spelling, and slang. Hopefully that creates enough of a rewrite to make the content essentially unique.
As far as the client goes, I am assuming they are trying to save some money because they don't understand the value of doing things correctly. You may want to look at ways to monetize the traffic they are already getting and might lose, or better yet, show them the traffic they are losing to their competitors.

A few ways to put this value into real terms: equate the cost of those lost visits to the average cost of a paid click for the same terms, or look at the actual value of each new customer over their entire customer lifetime. For example, a chiropractor might only get $76 for an individual visit but might earn $8,000 over the average customer lifecycle. If a customer never hits the site because of bad content or SEO, it did not cost your client $76; it cost them $8,000 per customer lost. If the site gets 1,000 visits with a 0.002 (0.2%) conversion rate, that is really two customers and $16,000 in revenue lost. Usually, when people look at their traffic in these terms, spending the money to do it right makes more sense.
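The lost-revenue arithmetic above can be sketched in a few lines (the visit count, conversion rate, and lifetime value are the illustrative numbers from the chiropractor example, not real client data):

```python
# Illustrative figures from the example above -- not real client data.
monthly_visits = 1_000      # visits being lost to competitors
conversion_rate = 0.002     # 0.2% of visits become customers
lifetime_value = 8_000      # revenue per customer over the full cycle

lost_customers = monthly_visits * conversion_rate
lost_revenue = lost_customers * lifetime_value

print(f"Customers lost: {lost_customers:.0f}")   # 2
print(f"Revenue lost: ${lost_revenue:,.0f}")     # $16,000
```

Swapping in the client's own analytics numbers usually makes the case for fixing the content far more persuasive than talking about duplicate-content penalties in the abstract.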
I hope this helps,
Ron