Duplicate content when changing a site's URL due to algorithm penalty
-
Greetings
A client was hit by Penguin 2.1; my guess is that this was due to link building using directories. Google Webmaster Tools has detected about 117 links to the site, and they are all from directories. Furthermore, the anchor texts are a bit too "perfect" to be natural, so I guess these two factors have earned the client's site an algorithmic penalty (no manual penalty warning has been received in GWT).
I started to clean up some of the backlinks on October 11th. Some of the webmasters I contacted complied with my request to remove the backlinks; some didn't, and I disavowed the links from the latter.
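For context, the disavow file submitted to Google is a plain-text file where `#` lines are comments, `domain:` lines disavow a whole domain, and bare URLs disavow individual pages. A sketch (the directory names here are made-up examples, not the actual links):

```text
# Directory links where the webmaster did not respond to removal requests

# Disavow an entire directory domain:
domain:spammy-directory.example

# Or disavow a single listing URL:
http://another-directory.example/listings/client-site.html
```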
I saw some improvement in mid-October for the most important keyword (see graph), but ever since then the rankings have been falling steadily.
I'm thinking about giving up on the domain name and just migrating the site to a new URL. So FINALLY MY QUESTION IS: if I migrate this 6-page site to a new URL, should I change the content completely? I mean, if I just copy and paste the content of the current site into a new URL, I will incur duplicate content, correct?
Can I copy some of the content, or should I just start from scratch?
Cheers
-
Hey Masoko -
In the past, I've had luck with 410ing the previous site and putting a link on it saying that we've moved. This way you keep any direct traffic by referring visitors on, but you also don't redirect your pages via 301.
Penalties pass through redirects. You don't want to keep both sites and the duplicate content. I'd kill off the old site (it's only 6 pages, so that's pretty easy) and take the chance to, as has been said, refresh the content. Also, think about adding more pages to the site so you can rank for more long-tail terms.
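For reference, on an Apache server the 410s can be returned with mod_alias rules in the old site's `.htaccess` (a sketch; the page names are hypothetical placeholders for the 6 pages):

```apache
# Return "410 Gone" for the retired pages so they drop out of the index
Redirect gone /about.html
Redirect gone /services.html
Redirect gone /contact.html

# Leave the homepage live so it can display the "we've moved" notice
```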
Good luck.
-
Thanks everyone for answering my question!!!
-
As long as you 410 (delete) the old pages, they are no longer indexed and will not cause a duplicate content issue.
-
You can safely move to a new domain and move the content over (upgrading it a little), and there should be no duplicate content issues. Duplicate content filters were designed for things like scraping content from news feeds and posting it on your own site without adding anything unique or original, or selling products as a reseller without doing anything to change the manufacturer's text, etc.
If you move the site to a new domain, I would just 410 the pages on the old site and not do any redirects. You were probably only ranking for a short period of time because of the unnatural backlinks. If you redirect them, you will pass the negative link value over to the new site (from the links that were not removed or disavowed, anyway), and there are probably not enough good link metrics to warrant a redirect. You will lose any traffic from people who are trying to visit the old site, so maybe you can put up a message on the old site's homepage that it has moved to a new domain, but not link to it.
-
Masoko-T,
If you're sure that the penalty is from link building, you should have no problem. As mentioned above, a refresh of the content might be a good idea, though.
-
Hi Tuzzel
Thanks for your reply. Are you sure there are no duplicate content risks? I thought that, since Google had already indexed the original content, finding the same content on a different (newer) site would cause the latter to be considered "duplicate".
I hadn't thought about the 302 redirects, that's not a bad idea :).
-
If you're moving a site, Google's recommendations are to move the content and redirect. However, it sounds like you're looking for a fresh start.
Are you sure it's the links? Are you also concerned about EMD penalty or just hoping for a fresh start?
-
You should be OK just to replicate it, but by all means use the opportunity to refresh the content; 6 pages shouldn't take too long. If you want to be extra safe, you can of course just rewrite from scratch. The penalty will be at the domain level, so you should be OK to redirect the existing pages to the new URLs; this will signal to search engines that the pages have been moved and that the redirected pages should not be counted as unique content, avoiding duplicate content issues. You can also use a cross-domain canonical tag.
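The cross-domain canonical mentioned above is a single tag in the `<head>` of each old page, pointing at its counterpart on the new domain (the domain names here are placeholders):

```html
<!-- In the <head> of http://old-domain.example/services.html -->
<link rel="canonical" href="http://new-domain.example/services.html" />
```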
If you don't want to do any redirects, to totally sever your ties to the old domain's link profile, then remove the original pages from Google's index in your Webmaster Tools account and ensure you return 410 status codes to clients that request those pages. If you do still want to redirect users, however, 302 the pages to the new location, as this won't pass link equity.
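As a sketch, the two options above might look like this on an nginx server (domains and paths are hypothetical):

```nginx
# Option 1: temporary redirect - sends users along but does not pass link equity
location = /services.html {
    return 302 http://new-domain.example/services.html;
}

# Option 2: no redirect at all - everything else returns "410 Gone"
location / {
    return 410;
}
```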
Hope this proves useful.