Geo Domains & SEO Issues
-
Hi
Are there issues with duplicating content across geo-specific domains? For example, we have a client with a .co.uk site who wants to create a .ie website for the local market, rather than sending Irish visitors to the UK site. The content would be duplicated with limited customisation. What issues would you foresee, and how could they be overcome?
Thank you
-
I wouldn't use identical content. If it is effectively one website, keep the same site structure but change the content. You can vary your content and messaging while portraying the same theme. Google doesn't like duplicate content any way you look at it. Just my opinion, though.
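One common way to run a .co.uk/.ie pair with similar content is hreflang annotation, which tells Google which regional version to serve to which market. This is a minimal sketch, not from the answer above; the domains and path are placeholders:

```html
<!-- Placed on every page of BOTH sites (each page lists all of its
     regional alternates, including itself, plus a default). -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-ie" href="https://www.example.ie/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.co.uk/widgets/" />
```

The annotations must be reciprocal: if the UK page lists the IE page as an alternate, the IE page has to list the UK page back, or Google may ignore the hints.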
Related Questions
-
SEO Question - Are 503/504 errors an issue?
Lately I've noticed more and more 503/504 errors being flagged in my Moz reports. One week I had over 1,300 errors show up. I checked Google Webmaster Tools and Bing Webmaster Tools and noticed they were showing up in there too, although nowhere near as many (50 or fewer per day). I contacted my hosting company about it and they said these were normal and that it was due to one nameserver reaching capacity, but that there was a backup nameserver that kicks in. I've seen one or two of these errors show up before, but never more than one or two a week. Is this something I should be concerned about?
Technical SEO | Kyle Eaves
-
Parked Domains
I have a client who has a somewhat odd situation with their domains. They've been really inconsistent with how they've used them over the years, which makes for a slightly sticky situation. The client has two domains: compname.com and fullcompanyname.com. Right now, their website is just HTML (no CMS) and all of the URLs are relative, so both domains work. Since the new website will be in WordPress, they need to commit to one domain as the primary. Right now, it looks like compname.com is the one they've used the most in ads and such, so I'm going to recommend they go with that. However, the client has also used fullcompanyname.com a lot. They don't want to have to set up individual 301 redirects for everything. I think it's ridiculous, but you can lead a horse to water... Our developer has done some research and he may have found a solution that will satisfy the client. I just want to find out if there are any SEO implications. The possible plan is to use compname.com as the primary domain and to park fullcompanyname.com. That way, if someone visits fullcompanyname.com/products/my-favorite-product, it will still work without having to set up 301 redirects. Since the domain is parked, Google won't recognize it as duplicate content, correct? Just to be clear on the whole situation, I'm insisting that all of the website URLs need 301 redirects, regardless of the domain. The primary concern is with a lot of other stuff on the server that isn't related to the site (email campaign landing pages, image files, assets that are pulled in by the client's software, etc.). The client's concern is about redirecting all that other stuff (and there is a lot of it: thousands of files). The parked domain would seem to fix that, but I want to make sure that the client won't get Google-slapped.
Technical SEO | BopDesign
-
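Worth noting on the question above: if a "parked" domain still serves the same content, Google can index both hosts, so a server-level wildcard 301 is generally the safer route, and it also answers the client's objection to setting up individual redirects, since one pattern rule covers every path. A hedged sketch for Apache, assuming both domains resolve to the same server (domain names are placeholders from the question):

```apache
# .htaccess on the shared docroot: send any request arriving on
# fullcompanyname.com to the same path on compname.com with one rule.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?fullcompanyname\.com$ [NC]
RewriteRule ^(.*)$ https://www.compname.com/$1 [R=301,L]
```

This covers the "thousands of files" case too: email landing pages, images, and other assets all redirect path-for-path without being listed individually.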
A/B testing an entire website vs. SEO issues
I'm familiar with A/B testing variations of a page, but I'd like to A/B test a new design for an e-commerce site. I'm wondering about the best way to test with SEO concerns in mind. This is what I have in mind right now; any suggestions?

- Use parameters to make version B different from version A.
- Redirect 50% of the users with a 302 (or would JavaScript be a better way?).
- Use noindex on the B pages.
- Use rel=canonical on the B pages pointing to the A version.
- In the end, 301 redirect all B pages to the A URLs.

PS: We can't use a subdomain, and I don't want to use the robots.txt file to protect the new design from competitors. I'd love any suggestions and tips about it. Thanks, folks! 🙂
Technical SEO | SeoMartin1
-
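On the 50/50 split in the plan above: a server-side 302 based on a stable hash of a visitor ID keeps each user in the same bucket across visits, which matters for clean test data (a purely random split can flip a returning visitor between designs). A minimal sketch in Python; the salt and bucket logic are illustrative, not tied to any particular platform:

```python
import hashlib

def ab_bucket(visitor_id: str, salt: str = "redesign-test") -> str:
    """Deterministically assign a visitor to bucket 'A' or 'B'.

    Hashing (rather than random.choice) means the same visitor ID
    always lands in the same bucket, so a returning user never
    flips between the old and new design mid-test.
    """
    digest = hashlib.sha256(f"{salt}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The server would then 302 bucket-B visitors to the version-B URL
# and leave bucket-A visitors on the canonical page.
```

Changing the salt reshuffles everyone into fresh buckets, which is handy when one test ends and the next begins.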
What is cross domain?
What is cross domain? Can anyone explain it in simple language?
Technical SEO | constructionhelpline
-
Blog.domain.co.uk or domain.co.uk/blog
Hi guys, I'm just wondering which offers more SEO value and which is easier to set up out of: blog.domain.co.uk or domain.co.uk/blog. Thanks, Dan
Technical SEO | Sparkstone
-
301 redirects and seo..
I bought a domain and it has nice traffic. It only has about 5 main pages in PHP. When I got the site I switched to HTML because PHP was overkill. I did the 301, and when I check site:domain.com, Google has deleted the PHP files and replaced them with the HTML versions. It has been about 7 days. I DID NOT use a 301 for each of the 5 pages to go from PHP to HTML; instead I used this code:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain.com
RewriteRule (.*) http://www.mydomain.com/$1 [R=301,L]
RedirectMatch 301 (.*)\.php$ http://www.mydomain.com$1.html

So basically if you load the PHP URL it will redirect to the HTML version: dog.php > dog.html. Is this okay, or should it be done differently? Worried! Thanks!
Technical SEO | samerk
-
Duplicate Content Issue with
Hello fellow Moz'rs! I'll get straight to the point here. The issue, which is shown in the attached image, is that for every URL ending in /blog/category/name, there is a duplicate page at /blog/category/name/?p=contactus. Also, it's worth noting that the ?p=contactus URLs are not in the SERPs but were crawled by SEOMoz, and they are live and duplicate. We are using Pinnacle Cart. Is there a way to just stop the crawlers from hitting ?p=contactus? Thank you all and happy rankings, James
Technical SEO | JamesPiper
-
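For parameter duplicates like the one above, one low-effort option is blocking the parameter in robots.txt so compliant crawlers skip those URLs entirely; a rel=canonical on the parameter pages pointing at the clean URL is the more robust fix, since it consolidates signals rather than just hiding the duplicates. A sketch of the robots.txt approach, with the pattern assumed from the URLs described in the question:

```
User-agent: *
Disallow: /*?p=contactus
```

The `*` wildcard is supported by the major search engines' crawlers, so this matches the parameter at any path depth.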
Duplicate Pages Issue
I noticed a problem and I was wondering if anyone knows how to fix it. I was a sitemap for 1oxygen.com, a site that has around 50 pages. The sitemap generator come back with over a 2000 pages. Here is two of the results: http://www.1oxygen.com/portableconcentrators/portableconcentrators/portableconcentrators/services/rentals.htm
Technical SEO | | chuck-layton
http://www.1oxygen.com/portableconcentrators/portableconcentrators/1oxygen/portableconcentrators/portableconcentrators/portableconcentrators/oxusportableconcentrator.htm These are actaully pages somehow. In my FTP there in the first /portableconentrators/ folder there is about 12 html documents and no other folders. It looks like it is creating a page for every possible folder combination. I have no idea why you those pages above actually work, help please???0
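The symptom described above, one real page reachable at every folder-depth combination, is the classic signature of relative links being resolved against the current directory. A quick illustration with Python's urljoin; the URLs are placeholders modeled on the ones in the question:

```python
from urllib.parse import urljoin

# A page served under /portableconcentrators/ that links to
# "portableconcentrators/services/rentals.htm" (relative, no leading
# slash) resolves relative to the current directory -- so each crawl
# step can nest the folder one level deeper, multiplying "pages" that
# all serve the same file.
base = "http://www.example.com/portableconcentrators/"
link = "portableconcentrators/services/rentals.htm"

resolved = urljoin(base, link)
print(resolved)

# A root-relative link ("/services/...") resolves the same way from
# every directory, which avoids the explosion:
print(urljoin(resolved, "/services/rentals.htm"))
```

If the server also answers any nested path with the same document (e.g. via a catch-all rewrite), the crawler keeps finding "new" URLs forever; switching the site's links to root-relative or absolute URLs, plus rel=canonical on the real pages, usually stops it.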