How Many Words To Make Content 'Unique'?
-
Hi All,
I'm currently working on creating a variety of new pages for my website.
These pages are based on different keyword searches for cars: for example, used BMW in London, used BMW in Edinburgh, and many more similar variations. I'm writing some content for each page so that they're completely unique to each other (the cars displayed on each page will also be different, so this won't be duplicated either).
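For a sense of scale: the number of pages like this grows multiplicatively with the number of makes and locations. A minimal sketch (the makes, cities, and URL pattern here are invented for illustration, not from the original question) of enumerating the variations:

```python
from itertools import product

makes = ["BMW", "Audi", "Mercedes"]             # hypothetical inventory makes
cities = ["London", "Edinburgh", "Manchester"]  # hypothetical locations

# One landing page per make x city combination, each needing its own copy
pages = [f"/used-{make.lower()}-{city.lower()}"
         for make, city in product(makes, cities)]

print(len(pages), "pages to write unique content for")
```

Even three makes and three cities already mean nine pages of copy, which is why the question of a workable minimum word count matters.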
My question is really: how much content do you think I'll need on each page? What is optimal? What would be the minimum?
Thanks for your help!
-
Great question, and great answers from some of the other commenters. I've struggled with this question myself in building landing pages.
The 20% rule is a good one, and it makes sense, especially as Google gets better at semantic search and "keywords" become a bit less important in favor of query meaning. In a perfect world (one where search engines could understand queries the way a friend would if you told them what you searched for), if you can't come up with 20% of a landing page that is entirely unique to that page, it's not something you should be building a landing page for. In the world we operate in, it's a nice guideline.
My method for long-tail landing page creation: figure out which head keyword the long-tail landing page is most related to (if you're trying to reuse the same value prop), and rewrite every sentence. Alter your word choice, sentence structure, and page organization (it's a nice opportunity to test those things as well; a long-tail page that does unexpectedly well may give you some insight into a better-converting format). At that point, I add the unique content. For keywords that aren't different enough to have truly unique content, I'll generally write a section summarizing a few of the others together, or add a different customer testimonial.
To the commenter who mentioned that you can create content that's unique to search engines but that humans would laugh at: a landing page for long-tail keywords really shouldn't be something a customer can reach without coming from an external referrer. The root domain shouldn't link out to both domain.com/landing-page-head-kw and domain.com/landing-page-long-tail-kw.
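The 20% guideline above is fuzzy by nature, but you can roughly sanity-check it against a head page. A minimal sketch (the shingle size and the sample copy are assumptions for illustration, not anything from this thread) that estimates what fraction of a long-tail page's text is unique relative to its head page, using overlapping word shingles:

```python
import re

def shingles(text, k=5):
    """Lowercase the text, strip punctuation, and return the set of
    k-word shingles (overlapping word windows)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def unique_fraction(candidate, reference, k=5):
    """Fraction of the candidate page's shingles that do NOT appear
    in the reference page -- a rough proxy for 'percent unique'."""
    cand = shingles(candidate, k)
    if not cand:
        return 0.0
    ref = shingles(reference, k)
    return len(cand - ref) / len(cand)

# Invented example copy for a head page and a long-tail variant
head = ("Browse our range of used BMW cars available across the UK. "
        "Every used BMW is inspected and comes with a warranty.")
long_tail = ("Browse our range of used BMW cars available in London. "
             "Every used BMW is inspected and comes with a warranty. "
             "Visit our London showroom near the North Circular.")

print(f"{unique_fraction(long_tail, head):.0%} unique")
```

If that fraction can't plausibly reach 20% no matter how you rewrite, that's a signal the long-tail page may not deserve to exist as a separate page.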
-
Good Morning.
I am going to come at this from a slightly different viewpoint. There is a difference between rewriting an article to suit your website by adding, cutting, and modifying it, and simply spinning an article.
I'm being slightly presumptuous, simply for the sake of discussion and from personal experience cleaning up a website full of this sort of content. EGOL, a Samurai Mozzer, said a long time ago on another SEO board far, far away that one day search engines will rate websites on content alone, and nothing else. It seems like that statement is coming true.
The recent updates, and even ones dating as far back as Hummingbird, have all pointed toward the importance of relevant, powerful, new content. Google's E-A-T standards (expertise, authoritativeness, and trustworthiness) support that even more. In my opinion, Google is trying to emulate how a human would search for things; the days of tricking Google into thinking your website is something it isn't are close to being over.
Right now I am cleaning up a website that has content that is different enough to satisfy Google (at least to the point of not getting manual actions), but similar enough that any person who reads it laughs. We were getting plenty of traffic, but people were leaving once they noticed the similarity in the content.
It's tough to truly advise on tactics without looking at a website, and again, I am not suggesting you are spinning articles or trying to pull the wool over Google's eyes. I merely bring up another point.
I have learned that getting to the top of Google really is only half the battle. You still have to convert people once they get there. You have an opportunity here to not ONLY satisfy Google, but also convert customers. Writing unique content that meets the needs of Google yet ALSO convinces someone to purchase a car, in that moment... well, that's a win-win! I would suggest spending the extra time writing individual content. It will help in the long run. And in the event that Google gets even better at detecting duplicate content, you're protected!
If shortcuts worked, they would just be the way things are done. This approach may be faster, but in the long run it probably won't help as much as spending the time to write each page out.
I need a new car....
-
Do you have dealerships in each of those locations?
Content written exclusively for search engines and not for users is usually not the best type of content.
-
The easiest and quickest answer is that there is no word-count rule, but I would suggest you look into your competitors: see how they write and what kind of content they produce on these pages.
If most of them are writing long content pieces, you'll probably have to go with more words; if they're writing short ones, you have some margin.
Plus, you should use creativity in your content to convince potential customers to convert. Images, infographics, testimonials, and more will all help to a great extent.
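The competitor-length suggestion above can be sketched as a quick script. This assumes you've already extracted each competitor page's visible copy to plain text; the sample texts below are invented for illustration:

```python
from statistics import median

def visible_word_count(text):
    """Count words in a page's visible copy (assumes any HTML has
    already been stripped to plain text)."""
    return len(text.split())

# Hypothetical plain-text extracts from competing 'used BMW in <city>' pages
competitor_copy = [
    "Short intro of about a dozen words sits above the car listings here",
    " ".join(["word"] * 150),
    " ".join(["word"] * 300),
]

counts = [visible_word_count(t) for t in competitor_copy]
target = median(counts)
print(f"Competitor word counts: {counts}, suggested target ~ {int(target)}")
```

The median is used rather than the mean so one unusually long (or thin) competitor page doesn't skew the target.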
Hope this helps!
-
Thank you for your response.
This is really helpful. So essentially 20% is the minimum, but more than that would help?
-
Hi there,
Thank you for your response.
The purpose of the text is not so much to sell the car to the user (in this instance); we do have text on each individual car about its perks, technical specs, etc. This page simply displays lists of cars, with the content only really needing to introduce the cars so as to appease search engines.
So essentially the content is for the search engines' benefit, in the sense that it will differentiate the page from other pages and hopefully make it more likely to get indexed and bring us traffic for the long-tail keywords being targeted.
Lots of content might definitely overwhelm users, so really I'm trying to find the right balance of uniqueness and quantity!
-
This totally depends on too many variables for me to say (I think anyway).
Personally, I wouldn't overwhelm your visitors with too much wordy text. They're interested in test driving your cars and possibly buying one, NOT reading a load of gumph about the cars. Either they like the cars or they don't. Obviously you need to include the benefits and features of each vehicle, but I wouldn't write huge volumes of text, because that'll put people off.