Client has 3 websites for various locations & duplicate content is a big issue... Is my solution the best?
-
Hi guys,
I have a client who has 3 websites all for different locations in the same state in Australia.
Obviously this is not best practice, but in our meeting he said that each area is quite particular about where they do business: people from one area want to deal with a company whose website is from that particular area.
He has 3 domains, and we have duplicate content issues across them. At the moment we are handling these with the canonical tag; however, the sites are being redesigned soon.
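For reference, the canonical setup we're running now looks roughly like this - each duplicate page on the two secondary domains carries a tag in its head pointing at the preferred domain's copy (the domain names below are hypothetical, for illustration only):
<!-- In the <head> of the duplicate page on a secondary domain (hypothetical domains): -->
<link rel="canonical" href="http://www.companyname-mainarea.com.au/services/" />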
My suggestion is that we consolidate to 1 domain, with subdomains for the other 2 areas. That way people from each area will still see that the company operates in their area.
Also this way we have 1 domain to optimise and build domain authority for.
Has anyone else come across this, and is my solution the best approach?
Thanks!
Jon
-
Hi Sanket,
The poster is not talking about international content, but three locations within the same state.
If you quote from another source, it's helpful to let the community know the source of the post so they can find more information (and it's a courtesy to the blog author as well). For reference, the first main paragraph comes from Dr. Pete's article: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
As for spinning, I would not advise using spun content.
-
Hi,
There are two ways to handle local SEO here:
If you want to create pages targeting a geographic location, then you have to post unique content for every geographic region. If you aren't willing to make that investment, then don't create the pages; they'll probably backfire. This site is a good example for your problem: http://bandelal.com/
-
I second this. I have dealt with clients that have done the same thing. Have one domain (i.e. companyname.com) and use the footer to display the contact info for each location. Try not to promote any office more than the others and there shouldn't be an issue. In my experience the visitors will not care.
Do really awesome local SEO and set up places pages for each location, etc.
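To illustrate the footer idea, something like this sits on every page of the single domain, giving each office equal billing (the addresses and numbers are placeholders, not real details):
<!-- Site-wide footer listing every office equally (placeholder details): -->
<div id="footer">
  <address>Sydney office - 1 Example St, Sydney NSW - (02) 0000 0001</address>
  <address>Canberra office - 2 Example Ave, Canberra ACT - (02) 0000 0002</address>
  <address>Newcastle office - 3 Example Rd, Newcastle NSW - (02) 0000 0003</address>
</div>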
-
How would this client feel about a single website with a top banner that proclaimed:
Stores in Sydney, Canberra and Newcastle!
I am willing to bet one month's pay that a single site will be more successful than three separate sites.
Related Questions
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content I keep getting on my Moz reports. The main area where this comes up is duplicate page content and duplicate title tags... thousands of them. I partially understand the source of the problem. My site mixes free content with content that requires a login. I think if I were to change my crawl settings to eliminate the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicates, because a large number of duplicates occur at the site login. Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, to find where certain articles were truncated, and to copy the text that followed the truncation to complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work. My question is whether anyone knows a more efficient way of identifying and editing duplicate pages and title tags?
Technical SEO | Prop65
-
Company blog. What are the best solutions?
Hello Moz Community! Our company has its own blog (www.awarablogs.com) - the blog was created some time ago by means of a simple blog engine. Now we see that the structure of the blog is bad for SEO (it has long URLs, many useless folders, subdomains and so on), so we'd like to simplify it. But the engine doesn't allow us to change its structure in the way we'd like. Our webmaster suggested that we use "Alias". Will this method really help us make our blog SEO-friendly? Or is it better to choose other blog software like Wordpress? Thank you very much!
Technical SEO | Awaraman
-
Duplicate content in Magento
Hi all, We've got some serious issues with duplicate content on a Magento site that we are marketing. For example:
http://www.citcop.se/varmepumpar-luft-luft/panasonic/panasonic-nordic-ce9nke-5-0kw
http://www.citcop.se/panasonic/panasonic-nordic-ce9nke-5-0kw
http://www.citcop.se/panasonic-nordic-ce9nke-5-0kw
All of the above seem to work just fine as it is now, but since they are exactly the same product they should of course do a 301 redirect to the main page. Any ideas on how to sort this out in Magento without having to resort to manual work in .htaccess? Have a great day Fredrik
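For clarity, what we're after is that the two longer variants declare the short product URL as the preferred version - something like this in the head of each duplicate (a sketch only, not our actual markup):
<!-- Desired tag on the two longer URL variants (sketch only): -->
<link rel="canonical" href="http://www.citcop.se/panasonic-nordic-ce9nke-5-0kw" />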
Technical SEO | Resultify
-
How to prevent duplicate content at a calendar page
Hi, I've a calendar page which changes every day. The main url is /calendar. For every day, there is another url:
/calendar/2012/09/12
/calendar/2012/09/13
/calendar/2012/09/14
So, when the 13th of September arrives, the content of the page /calendar/2012/09/13 will be shown at /calendar. So, it's duplicate content. What to do in this situation?
a) Redirect from /calendar to /calendar/2012/09/13 with a 301? (but the redirect changes the day after, to /calendar/2012/09/14)
b) Redirect from /calendar to /calendar/2012/09/13 with a 302? (but will I lose the link juice of /calendar?)
c) Add a canonical tag at /calendar (which leads to /calendar/2012/09/13)? - but I will lose the power of /calendar (?) - and it will change every day...
Any ideas or other suggestions? Best wishes, Georg.
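For reference, option (c) would mean regenerating a tag like this in the head of /calendar every day (a sketch - the domain and date are examples only):
<!-- In the <head> of /calendar on 13 September 2012 (example only): -->
<link rel="canonical" href="http://www.example.com/calendar/2012/09/13" />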
Technical SEO | GeorgFranz
-
How can you avoid duplicate content within your own e-commerce website?
One of the e-commerce websites I am working on is giving me a lot of duplicate content errors, because all of the products are the same, just in different sizes. Does anyone have any ideas how to fix this problem, or should I just ignore it? Someone in the office brought up the idea to just use an iframe for all product descriptions. Any thoughts would be much appreciated.
Technical SEO | DTOSI
-
How to Solve Duplicate Page Content Issue?
I have created a campaign in the SEOmoz tools for my website and found 89 duplicate content issues in the report. Please look into the Duplicate Page Content issue. I am quite confused about how to resolve it. Can anyone suggest the best solution?
Technical SEO | CommercePundit
-
Duplicate content and tags
Hi, I have a blog on Posterous that I'm trying to rank. SEOmoz tells me that I have duplicate content pretty much everywhere (4 articles written, 6 errors at the last crawl). The problem is that I tag my posts, and apparently SEOmoz flags them as duplicate content only because I don't have many posts yet, so the tag pages end up being very, very similar. What can I do in these situations?
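One mitigation I've seen suggested is to noindex the tag archives while still letting crawlers follow their links - something like this in the head of the tag pages only (a sketch; I don't know yet whether Posterous allows editing that template):
<!-- In the <head> of tag/archive pages only (sketch): -->
<meta name="robots" content="noindex,follow" />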
Technical SEO | ngw
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: You've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again - for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
Add the meta noindex,follow tag to each URL you want de-indexed
Use GWT to help remove the pages
Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again - should you add this line to robots.txt:
DISALLOW */beerbottles/
Or add this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question" Thanks! Dave
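For reference, here's how the wildcard version would sit in a complete robots.txt - a sketch only, relying on Googlebot's documented support for the * wildcard, and note the directive needs the colon either way:
# Sketch: keep Googlebot out of any path containing /beerbottles/
User-agent: Googlebot
Disallow: /*/beerbottles/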
Technical SEO | goodnewscowboy