Lots of duplicate content, yet traffic keeps increasing... how does that work?
-
Hello Mozzers,
I have a dilemma with a client's site I'm working on that is making me question my SEO knowledge, or at least the way Google treats duplicate content. Let me explain.
The situation is this: organic traffic has been increasing steadily since last September, in every section of the site (home page, category and product pages), even though:
-
they have tons of duplicate content, with the same content on both old and new URLs (the URLs are in two different languages, even though the actual page content is in the same language in both versions)
-
indexation is left entirely to Google's discretion (no robots.txt file, no sitemap, no meta robots tag in the code, no canonicals, no redirects applied to any of the old URLs, etc.)
-
a lot (really, a lot) of URLs with query parameters (which leads to even more duplicate content) are linked from the site's inner pages (and indexed in some cases)
-
they have Analytics but don't use Webmaster Tools
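For reference, the canonical and meta robots signals missing from the list above are one line of markup each. A minimal sketch (the URLs here are hypothetical, not from the actual site):

```html
<!-- In the <head> of each old-URL page: point Google at the new version -->
<link rel="canonical" href="https://www.example.com/en/new-product-url/" />

<!-- On pages that should stay out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```

Neither tag requires server changes, which is why they're usually the lowest-risk starting point for an indexation cleanup.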
Now... they expect me to help them increase their traffic even further. I'll start with "regular" on-page optimization, as their titles, meta descriptions and headers are not at all optimized for the page content. After that, I was thinking of fixing the indexation and content duplication issues, but I'm worried I could "break the toy", since things are going well for them.
Should I be confident that fixing these issues will lead to even better results, or do you think it's better for me to focus on other kinds of improvements?
Thanks for your help!
-
-
Thanks everyone for taking the time to answer my question, have a nice day!
-
One thing that is often misunderstood is duplicate content: it does hurt you in a few ways, but Google doesn't take it into account as a site-wide ranking factor.
In other words, if you have duplicate content, Google will simply decline to index the duplicate pages, which is bad, but it's not going to penalize the rankings of your non-duplicate pages just because there are a lot of duplicates.
Duplicate pages are bad because each one is a lost opportunity to add a page to the index, and they waste crawl budget, theoretically reducing how often Google refreshes and re-ranks your pages.
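Since the original question mentions old URLs with no redirects at all, one common way to consolidate those duplicates is a 301 redirect from each old URL pattern to its new equivalent. A hypothetical .htaccess sketch (the paths are made up, assuming an Apache server):

```apache
# Sketch: 301-redirect each old-language URL to its new equivalent,
# so Google drops the old copy and consolidates signals on one version
RewriteEngine On
RewriteRule ^productos/(.*)$ /products/$1 [R=301,L]
```

A 301 both removes the duplicate from the crawl path over time and passes the old URL's link equity to the new one, which plain noindexing does not.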
-
Hi Ryan,
first of all, thanks for finding the time to answer my question. You may be right, as:
-
the domain is 14 years old ("If I had to guess they're probably a pretty old site")
-
brand traffic increased after a Facebook page was created and became popular ("increasing in traffic due to Brand strength triggers")
So I guess what you say is probably right: Google is figuring out the site structure and the parameter URLs by itself. Still, duplicate content represents well over 50% of the overall site content, and I'm surprised this apparently isn't a big problem for them (I guess because it's duplicated internally and not from external sources).
Anyway, I won't touch this part for now and, as suggested, will try to focus on what has helped them so far and push those elements a little bit more.
Thanks again for your help!
-
-
At the very least, I'd add Webmaster Tools, as I've never seen a downside to doing that. Plus, it will give you more insight into what's helping drive the growth. If I had to guess they're probably a pretty old site that is increasing in traffic due to Brand strength triggers being emphasized within Google.
Duplicate content, query parameters, and indexation issues might end up not being that big of a combined problem, depending on how many pages they have indexed out of their total. Google is pretty good at figuring out a site's structure and parameters. Duplicate content is often not as severe an issue when it's all housed within one domain.
Mostly, look into their strengths and why things are working so well. Why, exactly, is their organic traffic increasing? That's what you want to help even further. Play to their strengths.
Related Questions
-
Should we add engaging and useful FAQ content to all our pages, or rather not, because of duplication and reduced unique content?
We are considering adding, at the end of all our 1,500 product pages, answers to the 9 most frequently asked questions. These questions and answers will be 90% identical across all our products; personalizing them further is not an option, and not really necessary, since most questions relate to the process of reserving the product. We are convinced this will increase user engagement with the page and time on page, and that it will be genuinely useful for visitors, as most will not visit the separate FAQ page. It will also add more related keywords/topics to the page.
Intermediate & Advanced SEO | lcourse
On the downside, it will reduce the percentage of unique content per page and add duplication. Any thoughts on whether, in terms of Google rankings, we should go ahead, and whether the engagement benefits may outweigh the downside of duplicated content?
Can faceted navigation via AJAX #parameters cause duplicate content issues?
We are going to implement a faceted navigation for an ecommerce site of about 1000 products.
Intermediate & Advanced SEO | lcourse
Faceted navigation is implemented via AJAX/JavaScript, which adds a large number of #parameters to the URL.
Faceted pages canonicalize to the page without any parameters. We do not want Google to index any of the faceted pages at this point. Will Google include pages with #parameters in its index?
Can I somehow tell Google to ignore #parameters and not index them?
Could this setup cause any SEO problems for us in terms of crawl bandwidth and/or link equity?
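For what it's worth, everything after a # (the URL fragment) is handled client-side and is never sent to the server, and Google generally ignores fragments when indexing, so every facet state resolves to the same base URL. That means a single canonical on the base page covers all of them. A sketch, with a hypothetical domain:

```html
<!-- On /shirts: one canonical covers every #facet variation, because
     fragments like /shirts#color=red&size=m never reach the server -->
<link rel="canonical" href="https://shop.example.com/shirts" />
```

This is one reason #-based faceting tends to be safer for crawl budget than ?query-parameter faceting, where each combination is a distinct crawlable URL.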
Duplicate content based on filters
Hi Community, There have probably been a few answers to this, and I have more or less made up my mind about it, but I would like to pose the question, or ask that you post a link to the correct article, please. I have a travel site with multiple accommodations (for example); obviously there are many filters to help find exactly what you want: you can sort by region, city, rating, price, type of accommodation (hotel, guest house, etc.). This all leads to one inevitable conclusion: many of the results will be the same. My question is, how would you handle this? Via a rel=canonical to the main categories (such as region or town), thus making them the successors, or by nofollowing all the sub-category pages, thereby not allowing any search to reach deeper in? Thanks for the time and effort.
Intermediate & Advanced SEO | ProsperoDigital
Which is the lesser of two evils: duplicate product descriptions or thin content?
It is quite labour-intensive to come up with product descriptions for our entire product range: 2,500+ products, in English and Spanish. When we started, we copy-pasted manufacturer descriptions, so they are not unique (on the web), plus some of them repeat each other. We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and using a very short unique phrase from the database, i.e. thin content? Thanks!
Intermediate & Advanced SEO | bjs2010
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content across their UK and US websites. Both websites are geographically targeted (via Google Webmaster Tools) to their specific location and have the appropriate local domain extension. Is the duplicate content a major issue, given that the sites target two different countries and geographic regions of the world? Is there any statement from Google about this? Regards, Bill
Intermediate & Advanced SEO | MBASydney
Proper Hosting Setup to Avoid Subfolders & Duplicate Content
I've noticed that when hosting multiple websites on a single account, you end up with your main site in the root public_html folder, but when you create subfolders for new websites, each one actually creates a duplicate website. E.g. http://kohnmeat.com/ is being hosted on laubeau.com's server, so you end up with a duplicate website: http://laubeau.com/kohn/ Does anyone know the best way to prevent this from happening? (i.e. canonical? 301? robots.txt?) Also, maybe a specific 'how-to' if you're feeling generous 🙂
Intermediate & Advanced SEO | ATMOSMarketing56
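One common fix for this addon-domain situation, assuming an Apache server, is a 301 in the root .htaccess that bounces any request for the subfolder path back to the real domain, so only one version can ever be crawled. A sketch using the URLs from the question:

```apache
# In laubeau.com's root .htaccess: 301 the addon-domain subfolder
# back to the real domain so only one version gets indexed
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?laubeau\.com$ [NC]
RewriteRule ^kohn/(.*)$ http://kohnmeat.com/$1 [R=301,L]
```

A canonical tag on the kohnmeat.com pages pointing at themselves would also help, but the redirect prevents the duplicate from being reachable at all.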
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Do you think this is a workable/acceptable solution?
Intermediate & Advanced SEO | gregelwell
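The robots.txt approach described above would be a couple of lines (the path here is hypothetical):

```text
# robots.txt at the site root — blocks crawling of the duplicate path
User-agent: *
Disallow: /duplicate-page/
```

One caveat: robots.txt only blocks crawling, not indexing, so a blocked URL can still appear in results if other pages link to it. A rel=canonical on the duplicate pointing at the preferred version is usually the safer option here, since it consolidates signals while leaving both navigation paths usable.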
Concerns about duplicate content issues with the Australian and US versions of a website
My company has an ecommerce website that's been online for about 5 years. The URL is www.betterbraces.com. We're getting ready to launch an Australian version of the website, and the URL will be www.betterbraces.com.au. The Australian website will have the same look as the US website and will contain about 200 of the same products featured on the US website. The only major difference between the two websites is the price charged for the products. The Australian website will be hosted on the same server as the US website. To ensure Australians don't purchase from the US site, we are going to have a geo-redirect in place that sends anyone with an AU IP address to the Australian website. I am concerned that the Australian website is going to have duplicate content issues. However, I'm not sure whether the fact that the domains are so similar, coupled with the redirect, will help the search engines understand that these sites are related. I would appreciate any recommendations on how to handle this situation to ensure our rankings in the search engines aren't penalized. Thanks in advance for your help. Alison French
Intermediate & Advanced SEO | djo-283669
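One standard way to tell search engines that two same-language sites target different countries, rather than duplicating each other, is hreflang annotations on each pair of equivalent pages. A sketch using the domains from the question (the product path is hypothetical):

```html
<!-- On BOTH versions of each product page: cross-reference the
     US and AU alternates so each country gets its own version -->
<link rel="alternate" hreflang="en-us" href="https://www.betterbraces.com/knee-brace" />
<link rel="alternate" hreflang="en-au" href="https://www.betterbraces.com.au/knee-brace" />
```

The annotations must be reciprocal (each page lists the other) to be honored, and they pair well with the geo-targeting already set in Webmaster Tools.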