Can't get my head around this duplicate content dilemma!
-
Hi,
Let's say you have a cleaning company with a services page covering window cleaning, carpet cleaning, etc., and the content on this page adds up to around 750 words.
Now let's say you would like to create new pages targeting location-specific keywords in your area.
The easiest way would be to copy the services page and just change all the tags to the location-specific term, but now you have duplicate content.
If I wanted to target 10 locations, does this mean I now need to generate 750 words of unique content for each page, which is basically the services page rewritten?
Cheers
-
That's great, ActivitySuper,
Just stage the project in a reasonable manner. Your copywriter can't do it all at once, but he/she can do it over time. Good luck!
Miriam
-
Yeah, I did find this very helpful. It's always good to know how someone has actually tackled this problem, and it also reassures me that I'm not being silly and that there is a better way of doing it.
Looks like unique for each page is the only way to go.
The only difference is I might add, say, 200 words for each page from the start and then add 100 more words each month; I think this might make it easier to write more about each location over time.
I've got a copywriter, which is half the battle won for me.
-
Thanks, Alan. Glad you found this helpful. I hope ActivitySuper will, too.
-
Thank you, Miriam, this is excellent advice; thank you for taking the time to write it all out. I wish I could give you 10 thumbs up!
-
Hi ActivitySuper!
Thanks for coming to Q&A with what is actually a very important question. I sympathize with your puzzlement here because I hear many local business owners saying the same thing: what am I supposed to write about?
Members here are giving you good advice - you've either got to be ready to make the effort/investment, or be satisfied with simply mentioning your services and locations and crossing your fingers. If you are the only game in town, that might get you somewhere, but if you've got even one local competitor, such an approach will not lead to the dominance that you no doubt seek.
Here is what I do for my clients (some of whom, coincidentally, are carpet cleaning companies!). This advice is given with the understanding that, like most business owners in the cleaning industries, you have one actual physical location but serve a variety of neighboring cities. If that's correct, read on. If that isn't correct and you've got multiple physical offices, let me know.
1. Implement the major local hooks on the website for the physical location - Google is always going to see you as most relevant to your city of location, not other cities within the service radius. In addition to doing the on-site Local SEO, get the business properly listed with a violation-free Google Place Page and other local business directory listings.
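One concrete piece of the on-site work in step 1 is structured data that ties the business to its physical city. Here's a minimal sketch of schema.org LocalBusiness markup; every name, address, phone number, and URL below is an invented placeholder, so swap in the real business details:

```html
<!-- Hypothetical LocalBusiness markup for the physical location.
     All values are placeholders for illustration. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Cleaning Company",
  "url": "https://www.example.com/",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Murfreesboro",
    "addressRegion": "TN",
    "postalCode": "37130"
  },
  "areaServed": ["Murfreesboro", "Smyrna", "La Vergne"]
}
</script>
```

The `address` block names the one physical location, while `areaServed` lists the surrounding service cities, which mirrors the one-office, many-cities situation described above.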
2. Create a list of your 5-10 main services. Make a menu on the site of a page for each of these services, optimized for the services + your city of location. The content must be good, strong and unique.
3. Create a list of your 5-10 main service cities. Create a city landing page for each of these cities (including your city of location), giving an overview of your work in that city. Make a menu of these pages on the site. Again, the content must be good, strong and unique. No cutting and pasting!
At this point you will have developed 10-20 pages of unique, creative content for your website. Depending on the competitiveness of your industry in your region, this may get you enough rankings to satisfy you and get phones ringing. However, in most cases, you will want to do more. Move on to step 4.
4. Now, create a big list of all possible combinations. This might look like:
Carpet Cleaning City A
Carpet Cleaning City B
Carpet Cleaning City C
Window Cleaning City A
Window Cleaning City B
Window Cleaning City C
Tile and Grout Cleaning City A
etc.
Create a timeline for writing articles over a set number of months to cover each of these phrases. You're not going to do this all at once. My clients have most typically requested anywhere from 3-10 articles a month. A blog is terrific for this sort of thing, by the way. If the client has hired me to write 10 articles a month, in 3 months we've covered 30 terms, in 6 months we've covered 60 terms, etc.
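The combination list above is easy to build and schedule programmatically. A quick sketch, using the placeholder service and city names from the example:

```python
from itertools import product

services = ["Carpet Cleaning", "Window Cleaning", "Tile and Grout Cleaning"]
cities = ["City A", "City B", "City C"]

# Every service/city combination becomes one future article topic.
topics = [f"{service} {city}" for service, city in product(services, cities)]

# At a given writing pace, how many months does covering the list take?
articles_per_month = 3
months_needed = -(-len(topics) // articles_per_month)  # ceiling division

print(topics[0])      # Carpet Cleaning City A
print(len(topics))    # 9
print(months_needed)  # 3
```

With real lists (say 8 services across 10 cities at 10 articles a month), the same arithmetic tells you how long the article calendar runs before every phrase is covered.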
The client has to participate in this. If he simply paid some penny copywriter to write a bunch of boring, generic content for this large number of terms, chances are he wouldn't end up with a very pleasant or persuasive website. Rather, he needs to be photographing his projects in the different cities and coming to me with photos, testimonials from clients in the service cities, anecdotes and what have you. I take this, combine it with a solid knowledge of the city and the services/products used, add other photographs and maybe maps, and turn each article into a very solid piece of content. The approach is quite authentic and results in an ever-growing library of content about the client's work in each of his cities.
Remember, the whole point of this approach is to obtain secondary visibility (typically organic) for terms outside of his city of location. It should be seen as an ongoing project, and I've seen this approach work time and again for my clients.
You're at an important point of decision right now. You need to decide whether you have the creativity and time to do this right on your own, whether to hire a copywriter skilled in local SEO to do it for you, or whether you just can't do either. Sincerely wishing you luck!
Miriam
-
I agree, Alan. No matter how hard you try, it is going to carry some level of duplication - you would be better off trying to target all 10 locations on the main services page than trying to re-spin the same 750 words. Your first suggestion is the approach I would take as well.
-
What you just said is only true if you have nothing to write about.
I should have made it clear that I only advocate doing that if you have something to say that is meaningful.
If you don't, then don't do it. If you have something relevant and useful to say, that is better than repeating the same information, whether you rewrite it or not.
Spinning the 750 word story into 10 different versions is a really bad idea, in my opinion.
-
Yeah, that is an option, but you're now looking at creating semi-relevant content (and that's only if you can find something semi-relevant to write about for each location).
But your reply is an option, so thanks.
-
Only if you want them to be indexed.
Alternatively, you can write 150 to 200 words that apply specifically to the location and link off to the original page with 750 words of stunning content.
For example, here is an idea:
_Window Cleaning and Carpet Cleaning in Murfreesboro_
Murfreesboro and the surrounding areas sometimes present a problem for carpet cleaners because of the high incidence of termites. These termites..... blah blah blah.
....
Our Murfreesboro carpet cleaning crews are all locals, so they have an intimate knowledge of ... blah blah blah.
....
Read about how our carpet cleaning service fits your unique needs.
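As a rough sketch, such a short location page might be structured like this; the copy is the placeholder text from the example above, and the business details and `/services` URL are invented for illustration. The key point is the link back to the single full 750-word services page:

```html
<!-- Hypothetical short city page: 150-200 words of genuinely local copy,
     linking back to the one full services page. -->
<html>
<head>
  <title>Window Cleaning and Carpet Cleaning in Murfreesboro</title>
</head>
<body>
  <h1>Window Cleaning and Carpet Cleaning in Murfreesboro</h1>
  <p>Murfreesboro and the surrounding areas sometimes present a problem for
     carpet cleaners because of the high incidence of termites. ...</p>
  <p>Our Murfreesboro carpet cleaning crews are all locals, so they have an
     intimate knowledge of ...</p>
  <!-- One link to the single, canonical services page -->
  <p><a href="/services">Read about how our carpet cleaning service fits
     your unique needs.</a></p>
</body>
</html>
```

This keeps each location page short, specific, and honest about what is actually local, while the detailed service content lives in one place.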