Dealing with manual penalty...
-
I'm in a back-and-forth with Google's Search Quality team at the moment. We discovered a manual penalty on our website and have been trying to get it removed. The problem is that we have tons of spammy incoming links.
We did not ask for or purchase any of these links; it just so happens that spammy websites are linking to our site. Regardless, I've done my best to remove quite a few links in the past week or so, responding to the Search Quality team with a spreadsheet of the links in question and the action taken on each link.
No luck so far.
I've heard that if I send an email to a website asking for a link removal, I should share that with Google as well. I may try that.
Some of the links are posted on websites with no contact info. A WhoIs search brings up a hidden registrant. Removing these links is far from easy.
My question is: what techniques have proven effective when working through the removal of a manual penalty? I know Google isn't going to tell me all of the offending links (they've offered a few examples, and we've had those removed, but we're still penalized), so what's the best way for me to find them myself? Also, when I have a link removed, it may stay in Webmaster Tools as an active link for a while even though it no longer exists. Does the Search Quality team use Webmaster Tools to check, or do they use something else?
It's an open-ended question, really. Any help dealing with a manual penalty and what you have done to get that penalty removed is of great help to me. Thanks!
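Since the back-and-forth above is spreadsheet-driven, one practical step is turning that spreadsheet (exported as CSV; the `url` column name here is an assumption) into a Google disavow file, which is a plain-text file with one `domain:` entry per line. A minimal sketch, useful for the links whose owners can't be contacted:

```python
import csv
import io
from urllib.parse import urlparse

def build_disavow(csv_text):
    """Read a CSV with a 'url' column of spammy links and emit
    disavow-file lines, one 'domain:' entry per unique host."""
    reader = csv.DictReader(io.StringIO(csv_text))
    domains = sorted({urlparse(row["url"]).netloc for row in reader})
    return "\n".join("domain:" + d for d in domains)

# Hypothetical export: same columns as the removal spreadsheet.
sample = (
    "url,action\n"
    "http://spam1.example/page,removal requested\n"
    "http://spam1.example/other,no contact info\n"
    "http://spam2.example/x,removed\n"
)
print(build_disavow(sample))
# domain:spam1.example
# domain:spam2.example
```

Disavowing at the domain level covers the case where a spammy site links from many pages at once; the resulting file is uploaded through the Disavow Links tool in Webmaster Tools.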
-
Ryan Kent has some experience with this, and shared it in this Q&A at http://www.seomoz.org/q/does-anyone-have-any-suggestions-on-removing-spammy-links
Related Questions
-
Dealing with Expired & Recurring Content At Scale
Technical SEO | triveraseo
Hello, I have a question concerning maintaining and pruning content on a large site (~12,000 pages) with a ton of pages that are either expired or recurring. Large sections of the site have individual landing pages for time-sensitive content, such as promotions and shows. There are tons of shows every day, so the number of pages to manage keeps increasing.
Show URLs: I'm auditing the show URLs and looking at pages that have backlinks; those I am redirecting to the main show pages. However, a significant number of show URLs from a few years ago (2012, 2013, 2014, 2015) get no traffic and have no backlinks or ranking keywords. Can I delete these pages entirely from the site, or should I 410 them first (and then delete them, or can you let 410s sit)? They are in the XML sitemap right now, so they get crawled, but they are essentially useless. I want to cut off the dead weight, but I'm worried about deleting a large number of pages from the site at once. For show URLs that are obsolete but still rank well for keywords and get some traffic, is there any recommended option? Should I bother moving them to a past-shows archive section, or ax them, since the traffic is small compared to what the main pages get? There are also URLs that are orphaned and obsolete right now but will recur. For instance, when an artist performs, they get their own landing page and may acquire some backlinks and rankings, but then that artist doesn't come back for a few months. The page just sits there, orphaned and in the XML sitemap; however, regardless of backlinks and keywords, the page will come back eventually. Is there any recommended way to maintain this kind of situation? Again, there are a LOT of URLs in this same boat.
Promotional URLs: I'm going through the same process for promotions, and thankfully the scale of the issue is much smaller. Same question as above, though: they have promotional URLs, like NYE-special-menu or Lent-specials landing pages, for each of their restaurants. These pages are only valid for a short amount of time each year and are otherwise obsolete. I want to reuse the pages each year, but I don't want them to just sit there in the XML sitemap. Is there ever an instance where I might want to 302 redirect them, and then remove the 302 for the short amount of time they are valid?
I'm not AS concerned about the recycled promotional URLs, since there are far fewer URLs in that category. However, as you can probably tell, this large site has this problem of recurring content throughout, and I'd like to get a plan in place to clean it up and then create rules to maintain it. Promotional URLs that recur are fewer, so if they are orphaned it's not the end of the world, but there are thousands of show URLs with this issue, so I really need to determine the best play here. Any help is MUCH appreciated!
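Whatever is decided about 410s, the question above notes the dead show URLs are still in the XML sitemap; pruning them by year is easy to script. A minimal sketch (the `/shows/YYYY/` URL pattern is a hypothetical stand-in for the real site's structure):

```python
import re
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def prune_sitemap(xml_text, expired_years):
    """Drop <url> entries whose /shows/YYYY/ path segment falls in
    expired_years, so dead show pages stop being submitted for crawling."""
    root = ET.fromstring(xml_text)
    for url in list(root):
        loc = url.find(NS + "loc").text
        m = re.search(r"/shows/(\d{4})/", loc)
        if m and int(m.group(1)) in expired_years:
            root.remove(url)
    return [u.find(NS + "loc").text for u in root]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url><loc>https://example.com/shows/2013/old-show/</loc></url>
<url><loc>https://example.com/shows/2019/current-show/</loc></url>
</urlset>"""
print(prune_sitemap(sitemap, {2012, 2013, 2014, 2015}))
# ['https://example.com/shows/2019/current-show/']
```

Removing a URL from the sitemap doesn't deindex it, but combined with a 410 it stops wasting crawl budget on pages that are known to be gone.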
Manual Action found in WMTs, no email, no message in WMTs
Someone I know said that they were looking through their WMT and, under Manual Actions, found that they had a partial penalty. There is no date against it, they never got an email, and there are no messages in WMT for it. I haven't personally dealt with a manual penalty before, but I would have expected there to be a message in WMT for it (an email might have been missed because of a spam filter, etc.). Could it be a very old penalty?
Technical SEO | PaddyDisplays
Google’s Latest Manual Action Penalty: Spammy Structured Markup
Has anyone out there begun receiving this, or know when it started? Google has recently begun sending a new manual-action spam notification to webmasters for "spammy structured markup," also known as rich snippet spam. Your pal, Chenzo
Technical SEO | Chenzo
Manual Actions tab advice on message
Technical SEO | pauledwards
Ok, so I have this message in Manual Actions (with no example links):
Site-wide matches: None
Partial matches: Some manual actions apply to specific pages, sections, or links.
Reason: Unnatural links to your site (impacts links)
"Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more."
I am not surprised by this, as an agency a few years ago did mass article submissions for the same anchor text. I have manually removed 119 or so domains in the last year and a half, and four weeks ago I disavowed the last 40 or so domains left. Obviously the backlink profile can still be seen to have an unnatural anchor-text distribution, but it's not as bad. In terms of rankings, we lost some core terms on the homepage: not completely, but most have gone from page one to page 2/3/4. We are still getting good traffic to internal pages, so I am assuming action was taken on the homepage, where the mass of those links point. Where do you guys recommend I go from here? Shall I go ahead and submit the reconsideration request, or wait longer for the disavow to take effect? I am still also trying to remove bad links. Any advice much appreciated.
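The "unnatural anchor-text distribution" mentioned above is easy to quantify from any backlink export. A minimal sketch (the anchor strings and targets are hypothetical) that tallies each anchor's share of the profile, so progress after removals and disavows can be measured:

```python
from collections import Counter

def anchor_distribution(rows):
    """Given (anchor_text, target_url) pairs from a backlink export,
    return each anchor's share of total links, largest first."""
    counts = Counter(anchor for anchor, _ in rows)
    total = sum(counts.values())
    return [(anchor, round(n / total, 3)) for anchor, n in counts.most_common()]

# Hypothetical export rows: three exact-match commercial anchors,
# one brand/URL anchor, one generic anchor.
links = [
    ("cheap blue widgets", "/"),
    ("cheap blue widgets", "/"),
    ("cheap blue widgets", "/"),
    ("example.com", "/about"),
    ("click here", "/"),
]
print(anchor_distribution(links))
# [('cheap blue widgets', 0.6), ('example.com', 0.2), ('click here', 0.2)]
```

A natural profile is usually dominated by brand and URL anchors; a single commercial phrase holding the majority share, as in the sample, is the pattern that tends to draw this kind of action.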
WordPress Redirect Plugin vs. Manual .htaccess?
Hi everyone, I need to 301 redirect my old pages to new ones, but I am confused about whether to use a plugin for this or to write the rules manually in the .htaccess file. Please give your suggestions, and if you think I should use a plugin, which one?
Technical SEO | himanshu301989
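For the manual route, the .htaccess side is only a handful of rules. A minimal sketch, with hypothetical old and new paths, assuming Apache with mod_alias (and mod_rewrite for pattern-based moves) enabled:

```apache
# One-to-one page moves (mod_alias):
Redirect 301 /old-page/ /new-page/
Redirect 301 /services.html /services/

# Pattern-based moves (mod_rewrite), e.g. restructured blog archives:
RewriteEngine On
RewriteRule ^blog/([0-9]{4})/(.*)$ /archive/$1/$2 [R=301,L]
```

A plugin is easier to manage for non-technical editors and survives theme changes, while .htaccess rules run before WordPress loads and so are marginally faster; functionally, both produce the same 301 response.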
URL Structure for Deal Aggregator
Technical SEO | andywozhere
I have a website that aggregates deals from various daily-deal sites. I originally had all the deals on one page, /deals; however, I thought it might be more useful to have several pages, e.g. /beautydeals or /hoteldeals. However, if I give every section its own page, that means I either have no current deals on the main /deals page or I have duplicate content. I'm wondering what might be the best approach here? A few of the options that come to mind are:
1. Return to having all the deals on one page, /deals, and link internally to content within that page.
2. Have both a main /deals page with all of the deals plus other pages such as /beautydeals, but add rel="canonical" to point to the main /deals page.
3. Create new content for the /deals page... however, I think people will probably want to see at least some deals straight away, rather than having to click through to another page.
4. Display some sub-categories on the main /deals page, but have separate URLs for other more popular sub-categories, e.g. /beautydeals (this is how it works at the moment).
I should probably point out that the site also has other content, such as events and a directory. Any suggestions on how best to approach this are much appreciated! Cheers, Andy
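For option 2, the canonical tag goes in the head of each sub-category page. A minimal sketch, with hypothetical URLs:

```html
<!-- In the <head> of /beautydeals, pointing at the main listing -->
<link rel="canonical" href="https://example.com/deals" />
```

Worth noting as a trade-off: canonicalizing the sub-category pages to /deals consolidates ranking signals on the main page, but it also tells Google not to rank /beautydeals itself, which works against option 4 if those sub-category pages are meant to rank for their own terms.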
Client's site dropped completely for all keywords, but not brand name - not manual penalty... help!
We just picked up a new search client a few weeks ago. They've been a customer (we're an automotive dealer website provider) since October of 2011. Their content was very generic (came from the previous provider), so we did a quick once-over as soon as he signed up. Beefed up his page content, made it more unique and relevant... tweaked title tags... wrote meta descriptions (he had none). In just over a week, he went from ranking on page 4 or 5 for his terms to ranking on page 2 or 3. My team was working on getting his social media set up, set up his blog, started competitor research... And then this last weekend, something happened and he dropped completely from the rankings... He still shows up if you do a site: search, or if you search his exact business name, but for everything else, he's nowhere to be found. His URL is www.ohioautowarehouse.com, business name is "Ohio Auto Warehouse" We filed a reconsideration request on Monday, and just got a reply today that there was no manual penalty. They suggested we check our content, but we know we didn't do anything spammy or blackhat. We hadn't even fully optimized his site yet - we were just finishing up his competitor research and were planning on a full site optimization next week... so we're at a complete loss as to what happened. Also, he's not ranking for any of the vehicles in his inventory. Our vehicle pages always rank on page 1 or 2, depending on how big the city is... you can always search "year make model city" and see our customers' sites (whether they're doing SEO or not). This guy's cars aren't showing up... so we know something is going on... Any help would be a lifesaver. We've been doing this for quite some time now, and we've never had a site get penalized. Since the reconsideration request didn't help, we're not sure what to do...
Technical SEO | Greg_Gifford
How can I deal with AJAX pagination?
Hello! I would like your input on how to deal with a specific page on my website. You can see my page here. As you can see, we have a list of 76 ski resorts; our pagination uses AJAX, which means we have only one URL. Just below the list, we have a simple list of all the ski resorts in this mountain range, which shows all 76 of them. I know it's quite bad, since we can reach the same ski resort through two different anchor links. Thank you very much in advance, Simon
Technical SEO | Alexandre_
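One common pattern for making AJAX pagination crawlable is to render each page as a plain link with a real URL, then let JavaScript intercept the clicks: crawlers follow the hrefs, while users still get the AJAX experience. A minimal sketch with hypothetical URLs:

```html
<!-- Real, crawlable URLs; a script can hijack these clicks and
     load the next page of results via AJAX instead. -->
<nav class="pagination">
  <a href="/ski-resorts?page=1">1</a>
  <a href="/ski-resorts?page=2">2</a>
  <a href="/ski-resorts?page=3">3</a>
</nav>
```

This also resolves the duplicate-path concern above: the full all-resorts list below the fold can then be removed or canonicalized, since every resort is reachable through ordinary paginated links.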