Partial Match Penalty Site - Move Portion & Redirect To New Site
-
So I have a site that currently has a partial match penalty from Google, and I have been working to get it removed. Bad SEO, basically: my site was submitted to a bunch of bad blog networks. Hopefully it gets lifted soon as we remove and disavow links.
That said, I was planning on moving a portion of my site to a new site, since it's not really the focus of the site anymore but still pays the bills. I have also been building it into more of a network of sites. So if I do that and 301 redirect the pages I moved, will the penalty carry over? On the current site I planned on using rel="nofollow" on any links that I may change in the header/menus, etc.
Some of these pages I believe have the penalty, while others don't. I really just don't want to screw anything up more than it already is.
My biggest fear is that it's perceived as a black-hat method or something like that. Any thoughts?
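To make the mechanics of the partial move concrete: the redirect itself is usually just a rule or two in `.htaccess`. A minimal sketch, assuming Apache; the section path and new domain here are hypothetical, not taken from the question:

```apache
# Hypothetical example: 301 only the moved section to the new domain,
# leaving the rest of the current site untouched.
RedirectMatch 301 ^/moved-section/(.*)$ https://new-site.example.com/$1
```

The rule only controls which pages are forwarded; whether a penalty follows them is the separate question the answers address.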
-
As long as you don't 301 penalized pages / websites to a non-penalized site, you shouldn't have anything to worry about.
However, if you are sure your current site has pages that are penalized, rather than a link penalty (which means that incoming links are not being counted, not that your site itself is penalized), then you shouldn't 301 anything if you are planning to "start over".
There have been reports from people with penalized sites who moved to another domain and were able to recover. I would only recommend such a move if you are now working only with your previous customers (those you had before the penalty). But if you are still gaining search traffic and customers, then you should consider fixing the issues instead of moving.
Just my 2 cents
-
Be careful with partial moves. I have been stung before by doing this incorrectly.
Essentially, what happens is that all the internal links pointing to the pages you move will count as external links to the brand-new pages. This could create a large number of backlinks from header, footer, sidebar, and other menu links, and give the new site a penalty on the next Penguin refresh.
Related Questions
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article.

My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which seems like an ideal solution. This has the additional benefit of speeding up page load time.

My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?

Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/

Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | Daniel_Morgan -
Permanently Moving a Few High-Ranking Pages from One Domain to Another
We are planning to permanently move (301 redirect) a few high-ranking pages to another domain. Currently these pages get good traffic from organic search and rank in top positions in Google's search engine result pages. We have a few questions in mind right now, and it would be a great help if anyone could answer the following:

1. Is it possible to move a few pages from one domain to another using 301 redirection in the .htaccess file?
2. Will it have any negative impact on our website's current search engine performance?
3. Will it be considered a legitimate SEO practice by Google?
4. Will Google understand that these pages have been moved permanently to another domain, and start showing URLs from the new domain in the same positions where they were ranking before the move?
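On the mechanics of the first question: a per-page 301 from one domain to another is straightforward in `.htaccess`. A minimal sketch, assuming Apache and entirely hypothetical URLs:

```apache
# Hypothetical example: redirect individual pages to their new home
# on the other domain; pages not listed are unaffected.
Redirect 301 /old-page-1 https://newdomain.example.com/new-page-1
Redirect 301 /old-page-2 https://newdomain.example.com/new-page-2
```

Each rule maps one old path to its full new URL; `Redirect` matches by path prefix, so use exact, distinct paths for a small set of pages.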
White Hat / Black Hat SEO | tigersohelll -
Why don't I outrank this site?
Hi Mozzers, I'm mystified. Why doesn't our site www.bosphorusyacht.com (ranked 15) outrank this site www.bosphorustour.com (ranked 5 and 6) for the keyword "bosphorus cruise"? Particularly for US based searches. We have far more links, shares, higher DA and PA and more related unique content on topic. Somehow they are even appearing with double listings in this search. Why is this? Am I missing something? Any ideas or suggestions appreciated.
White Hat / Black Hat SEO | emerald -
Re-Post: Unanswered - Loss of rankings due to hack. No manual penalty. Please advise.
Sorry for reposting, but I must have accidentally marked this as answered. I am still seeking advice/solutions. I have a client whose site was hacked. The hack added a fake directory to the site and generated thousands of links to a page that no longer exists. We fixed the hack and the site is fully protected. We disavowed all the malicious/fake links, but the rankings fell off a cliff (they lost top-50 Google rankings for most of their targeted terms). There is no manual penalty set, but it has been 6 weeks and their rankings have not returned. In Webmaster Tools, their priority #1 "Not found" page is the fake page that no longer exists. Is there anything else we can do? We are out of answers and the rankings haven't come back at all. Any advice would be helpful. Thanks!
White Hat / Black Hat SEO | digitalimpulse -
How can I recover from an 'unnatural' link penalty?
Hi, I believe our site may have been penalised due to over-optimised anchor text links. Our site is http://rollerbannerscheap.co.uk. It seems we have been penalised for the keyword 'Roller Banner', as the over-optimised anchor text contains the keyword 'Roller Banner' or 'Roller Banners'. We dropped completely off page 1 for 'Roller Banner'. How would I recover from this?
White Hat / Black Hat SEO | SO_UK -
My Site Has a Drop in Traffic with Each Passing Month
Hi, I've been running a text message site, mixsms.com, since 2009. It was performing well until Oct 2012. At the end of Nov 2012 I noticed a drop in traffic, and with each passing month I'm losing 1,500-2,000 unique visitors. In Oct 2012 my unique visitors were 15,000+ each day, and since the end of Feb it's down to just 2,000. I've done several things to improve my site: I changed the template, removed all unnecessary HTML elements, and changed the SEO structure (optimised with all modern SEO techniques). I stopped backlinking in Nov 2012, but instead of seeing improvements I'm continuously losing traffic. I'd highly appreciate your time if you could look into the site deeply to find out the exact issues causing this drop. I'm even ready to hire an SEO consultant if he is pretty sure to get 100% results. Thanks in advance for your support.
White Hat / Black Hat SEO | intelmixx -
301 Redirect ASP code
Hi, I have a script, detailed below, that 301 redirects based upon different queries:

```asp
<% If (Request("offset") = "") Then %>

  <% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" Then 'Sector and Location NOT NULL %>
  <% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
       Response.Status = "301 Moved Permanently"
       Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
       Response.End
     End If %>
  <% End If %>

  <% If Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" Then 'Sector NOT NULL and Location NULL %>
  <% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
       Response.Status = "301 Moved Permanently"
       Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
       Response.End
     End If %>
  <% End If %>

  <% If Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" Then 'Sector NULL and Location NOT NULL %>
  <% If (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
       Response.Status = "301 Moved Permanently"
       Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
       Response.End
     End If %>
  <% End If %>

<% End If %>
```

But this still allows both the www and non-www versions of these pages to render in the browser, which is resulting in duplicate content. On my home page I use:

```asp
<% If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
     Response.Status = "301 Moved Permanently"
     Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & "/"
     Response.End
   End If %>
```

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www versions to the www version? In other words, `domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))` would redirect to `www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))`. Thanks in advance
White Hat / Black Hat SEO | TwoPints
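One possible way to combine them, sketched in classic ASP/VBScript. This is a hedged sketch, not a drop-in fix: it assumes the www check can simply run before the existing query-based rules (e.g. at the top of a shared include), and it reuses the `HTTP_X_REQUEST_URI` server variable that the script above already relies on:

```asp
<%
' Sketch: canonicalise the host first, preserving the requested URI,
' so the query-based rules below only ever run on the www host.
If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
    ' e.g. domain.com/jobs-in-london redirects to www.domain.com/jobs-in-london
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & Request.ServerVariables("HTTP_X_REQUEST_URI")
    Response.End
End If

' ...the existing offset/keywords/sector/location redirect rules go here...
%>
```

A non-www request to a non-canonical path will take two hops (host first, then path), which is generally acceptable for 301s.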
Has anyone been able to recover a site that was slapped by Panda?
I have a client where the only thing I can determine is over-optimization of a couple of anchor terms, which the client no longer ranks for. I tried mixing it up with brand name, brandname.com, and a diversity of links, but nothing seems to budge. Does anyone have a similar problem?
White Hat / Black Hat SEO | foreignhaus