Removing duplicated content at large scale (80% of the website) using only NOINDEX.
-
Hi everyone,
I am taking care of a large "news" website (500k pages), which took a massive hit from Panda because of duplicated content (70% was syndicated content). I recommended that all syndicated content be removed and that the website focus on original, high-quality content.
However, this was implemented only partially. All syndicated content is set to NOINDEX (they think it is good for users to see standard news + original HQ content). Of course it didn't help at all. No change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated. I would consider this site "cheating" and not worthy for the user.
What do you think about this "theory"? What would you do?
Thank you for your help!
-
-
It has been almost a year now since the massive hit. After that, there were also some smaller hits.
-
We are putting effort into improvements. That is quite frustrating for me, because I believe our effort is being demolished by the old duplicated content (which makes up 80% of the website :-))
Yeah, we will need to take care of the link mess...
Thank you! -
-
Yeah, this strategy will definitely be part of the guidelines for the editors.
One last question: do you know of any good resources I can use for inspiration?
Thank you so much..
-
We deleted thousands of pages every few months.
Before deleting anything we identified valuable pages that continued to receive traffic from other websites or from search. These were often updated and kept on the site. Everything else was 301 redirected to the "news homepage" of the site. This was not a news site; it was a very active news section on an industry portal site.
You set 410 for those pages, removed all internal links to them, and Google was OK with that?
Our goal was to avoid internal links to pages that were going to be deleted. Our internal "story recommendation" widgets would stop showing links to pages after a certain length of time. Our periodic purges were done after that length of time.
We never used hard-coded links in stories to pages that were subject to being abandoned. Instead, we simply linked to category pages where something relevant would always be found.
Develop a strategy for internal linking that will reduce site maintenance and focus all internal links to pages that are permanently maintained.
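If it helps to picture that strategy, here is a minimal Python sketch of the idea (the window length and story fields are invented for illustration, not EGOL's actual setup): the recommendation widget only considers stories younger than the purge window, so internal links never point at pages that are about to be retired.

```python
from datetime import datetime, timedelta

PURGE_WINDOW_DAYS = 180  # assumed retention period, not the real number

def recommendable(stories):
    """Return only stories young enough that they won't be purged soon."""
    cutoff = datetime.utcnow() - timedelta(days=PURGE_WINDOW_DAYS)
    return [s for s in stories if s["published_at"] >= cutoff]

# Hypothetical example data
stories = [
    {"url": "/news/fresh-story", "published_at": datetime.utcnow() - timedelta(days=30)},
    {"url": "/news/ancient-story", "published_at": datetime.utcnow() - timedelta(days=900)},
]
print([s["url"] for s in recommendable(stories)])  # only the fresh story remains
```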
-
Yikes! Will you guys still pay for it if it's removed? If so, then combining the comments below with my thoughts: I'd delete it, since it's old and no longer time-relevant.
-
Yeah, paying ... we actually pay for this content (earlier management decisions :-))
-
EGOL your insights are very appreciated :-)!
I agree with you. Makes total sense.
So you didn't experience any problems removing outdated content (or "content with no traffic value") from your website? You set 410 for those pages, removed all internal links to them, and Google was OK with that?
Redirecting useless content: you mean setting a 301 to the most relevant page that is bringing traffic?
Thank you sir
-
But I still don't see the point of paying for content that is not accessible from search engines.
- "paying"?
Is my understanding right that if I set a canonical for these duplicates, Google has no reason to show these pages in the SERPs?
- correct
-
Hi Dimitrii,
thank you very much for your opinion. The idea of canonical links is very interesting. We may try that in the "first" phase. But I still don't see the point of paying for content that is not accessible from search engines.
Is my understanding right that if I set a canonical for these duplicates, Google has no reason to show these pages in the SERPs?
-
Just seeing the other responses. Agree with what EGOL mentions. A content audit would be even better, to see if there was any value at all on those pages (GA traffic, links, etc.). Odds are, though, that there was not any and you already killed all of it with the noindex tag in place.
-
Couple of things here.
-
If a second Panda update has not occurred since the changes were made, then you may not get credit for the noindexed content. I don't think this is "cheating"; with the noindex, you simply told Google to take 350K of its pages out of the index. The noindex is one of the best ways to get your content out of Google's index.
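For anyone who wants to double-check this at scale, here is a rough Python sketch (the URL is a made-up example; it assumes the requests and BeautifulSoup libraries) for verifying that a page really serves a noindex directive, either as an X-Robots-Tag header or as a robots meta tag:

```python
import requests
from bs4 import BeautifulSoup

def is_noindexed(url: str) -> bool:
    """Return True if the page serves a noindex directive."""
    resp = requests.get(url, timeout=10)

    # Case 1: HTTP header, e.g. "X-Robots-Tag: noindex, follow"
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        return True

    # Case 2: <meta name="robots" content="noindex"> in the HTML
    soup = BeautifulSoup(resp.text, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        if "noindex" in meta.get("content", "").lower():
            return True

    return False

if __name__ == "__main__":
    # Hypothetical syndicated article URL
    print(is_noindexed("https://example.com/news/syndicated-story"))
```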
-
If you have not spent time improving the non-syndicated content, then you are missing the more important part, which is to improve the quality of the content that you have.
A side point to consider here is your crawl budget. I am assuming that the site still internally links to these 350K pages, so users and bots will keep landing on them and having to process them, which is mostly a waste of time. Since all of these pages are out of Google's index thanks to the noindex tag, why not remove all internal links to those pages (i.e. from sitemaps, paginated index pages, menus, internal content) so that users and Google can focus on the quality content that is left over? I would then also 404/410 all those low-quality pages, as they are now out of Google's index and not linked internally. Why maintain the content?
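As a hedged sketch of how you might audit that before flipping pages to 410 (the file names and URLs are placeholders, not your actual setup): scan the pages you are keeping for internal links that still point at URLs on the purge list, so those links can be removed first.

```python
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

# Hypothetical input files: one URL per line
PURGE_LIST = set(open("urls_to_410.txt").read().split())   # pages to retire
KEEP_LIST = open("quality_pages.txt").read().split()        # pages to keep

for page in KEEP_LIST:
    html = requests.get(page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        target = urljoin(page, a["href"])  # resolve relative links
        if target in PURGE_LIST:
            print(f"{page} still links to soon-to-be-410 page {target}")
```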
-
-
Good point! News gotta be new
-
If there are 500,000 pages of "news" then a lot of that content is "history" instead of "news". Visitors are probably not consuming it. People are probably not searching for it. And actively visited pages on the site are probably not linking to it.
So, I would use analytics to determine if these "history" pages are being viewed, are pulling in much traffic, or have very many links, and I would delete and redirect them if they are no longer important to the site. This decision is best made at the page level.
For "unique content" pages that appear only on my site, I would assess them at regular intervals to determine which ones are pulling in traffic and which ones are not. Some sites place news in folders according to their publication dates and that facilitates inspecting old content for its continued value. These pages can then be abandoned and redirected once their content is stale and not being consumed. Again, this can best be done at the page level.
I used to manage a news section and every few months we would assess, delete and redirect, to keep the weight of the site as low as possible for maximum competitiveness.
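As a rough illustration of that page-level assessment (column names and the threshold are assumptions based on a typical analytics export, not my actual data), something like this can shortlist the stale pages to delete and 301 redirect:

```python
import pandas as pd

# Hypothetical 12-month pageview export from analytics
df = pd.read_csv("analytics_pageviews_12_months.csv")

# Only look at the old news archive, flag pages with negligible traffic
old_news = df[df["page_path"].str.startswith("/news/")]
candidates = old_news[old_news["pageviews"] < 10]  # arbitrary cut-off

candidates.to_csv("redirect_candidates.csv", index=False)
print(f"{len(candidates)} pages look stale enough to 301 to the news homepage")
```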
-
Hi there.
NOINDEX !== no crawling, and it surely doesn't equal NOFOLLOW. What you should probably be looking at is canonical links.
My understanding is (and I can be completely wrong) that when you get hit by Panda for duplicate content and then try to recover, Google checks your website for the same duplicate content. It's still crawlable, all the links are still "followable", it's still scraped content, and you aren't telling crawlers that you took it from somewhere else (by canonicalizing); it's just not displayed in SERPs. And yes, 80% of the content being noindexed probably doesn't help either.
So, I think what you need to do is either remove that duplicate content altogether, use canonical links pointing to the originals, or (a bad idea, but it would work) block all those URLs in robots.txt (at least that way those pages become uncrawlable). All of these are still disreputable techniques though, kinda like polishing the dirt.
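If you go the canonical route, here is a small sketch (the URLs are made up, and it assumes requests and BeautifulSoup) for spot-checking that a syndicated article actually points its canonical at the original publisher rather than at itself:

```python
import requests
from bs4 import BeautifulSoup

def get_canonical(url: str):
    """Return the href of the page's rel=canonical link, or None."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for link in soup.find_all("link", href=True):
        if "canonical" in (link.get("rel") or []):
            return link["href"]
    return None

# Hypothetical syndicated article; should print the original source URL,
# e.g. https://original-publisher.example/story, not the syndicated URL itself
print(get_canonical("https://example.com/news/syndicated-story"))
```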
Hope this makes sense.