All Thin Content removed and duplicate content replaced. But still no success?
-
Good morning,
Over the last three months I have replaced and removed all the duplicate content (1,000+ pages) from our site top4office.co.uk.
It has now been just under two months since we made all the changes, and we are still not seeing any improvement in the SERPs.
Can anyone tell me why we aren't making any progress or spot something we are not doing correctly?
Another problem: although we have removed 3,000+ pages using the removal tool, searching site:top4office.co.uk still shows 2,800 pages indexed (previously there were 3,500).
Look forward to your responses!
-
Thanks for your responses. We are talking about over 3,000 pages of duplicate content, which we have now removed and replaced with relevant, unique, and engaging content.
We completed all the content changes on 06/06/2013. I'm thinking of leaving it for a while to see whether our rankings improve within the next month or so. We may consider moving the site to another domain, since it features lots of high-quality content.
Thoughts?
-
I've had two sites with Panda problems. One had two copies of hundreds of pages in both .html and .pdf format (to control printing format). The other had a few hundred pages of .edu press releases republished verbatim at their request or with their permission.
Both of these sites had site-wide drops on Panda dates.
On one site we applied rel=canonical to the .pdf documents via .htaccess. On the site with the .edu press releases we used noindex,follow.
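For anyone wanting to replicate this: since a PDF has no `<head>`, the canonical hint has to be sent as an HTTP `Link` header, which Apache's mod_headers can add from .htaccess. A minimal sketch below — the file names and domain are hypothetical examples, not from the sites discussed:

```apache
# Requires mod_headers. Point a PDF at its HTML twin via an HTTP header,
# the equivalent of <link rel="canonical"> for non-HTML documents.
<Files "printing-guide.pdf">
  Header add Link '<http://www.example.com/printing-guide.html>; rel="canonical"'
</Files>

# The noindex,follow approach can likewise be applied to whole file types
# without editing pages, via the X-Robots-Tag header.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```

A per-file `<Files>` block like this gets unwieldy for hundreds of PDFs; at that scale a small script generating the blocks, or setting the header at the application level, is more practical.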
Both sites recovered to former rankings a few weeks after the changes were made.
If you had a genuine Panda problem and only a Panda problem then a couple months might be about the amount of time needed to see a recovery.
-
That's hard to say. A recent history and link profile like yours won't give your site the authority it needs for index updates at the frequency you would like. It's also possible that a hole has been dug that you cannot pop out of simply by reversing the actions of your past SEO.
You really need a thorough survey of your site, its history, and its analytics to determine the extent of the current problem and the best path out of it. Absent that, shed what bad backlinks you can and develop a strategy to build visitor engagement with your brand.
-
The site has not received a manual penalty from Google.
However, traffic and generic keyword rankings fell when the previous developer decided to copy all of the products directly from our other site, top4office.com.
The site was ranking pretty well in the past. Do you have any kind of ETA for when the updates will take effect?
-
Hi Apogee
It can certainly take several months for pages to drop from the index, so if you've removed the pages and submitted the URLs for removal in GWT, they'll eventually fall out of the index.
Was the site penalized, and is that why you removed/replaced the dupe content? In other words, were you ranking well and then, all of a sudden, your rankings tumbled, or are you just now working to build up your rankings? This is an important distinction, because there are few examples of sites that received a Panda penalty (thin/duplicate content) coming back to life.
If you don't think you've been penalized and you're just working to optimize your site and pull it up in the rankings for the first time, consider how unique your content is and how you're communicating your unique value proposition to the visitor. Keep focusing on those things.
Also, your backlink profile looks a bit seedy--in fact, your problem could well be Penguin-related. If you were penalized and it was a Penguin penalty, you should be looking to clean up some of those links and working to build new ones from more thematically relevant sites.
-
Removing duplicate content won't necessarily improve your search positions. It will, however, give your site the foundation needed to start a (relevant, natural, and organic) link-building campaign, which, if done correctly, should improve your SERP positions.
You should see content as part of the foundation. Good-quality, unique content is usually needed in order to be rankable, but it doesn't make you rank by itself.
Having good-quality, unique content will also minimise the chances of being hit by an algo update.