Partial match penalty & Penguin 2.1 smack
-
Our site is large and lets business owners post their inventory for sale. We also build websites for those businesses, and each of those business websites links back to our site's home page using our domain name as the anchor text.
Last summer we got a partial match penalty from Google:

"Unnatural links to your site—impacts links. Google has detected a pattern of unnatural artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole."
We investigated and found a large number of links from spammy sites, forum signatures, blog comments, etc., and we think we were hit by a negative SEO campaign. We started cleaning up the backlinks and disavowing them, but every reconsideration request since has been denied, each with more examples of these horrid links.
The latest reconsideration response cited, as examples of how we're violating Google's link quality guidelines, the very sites we build for businesses: "Google has received a reconsideration request from a site owner for domainname.com. We've reviewed the links to your site and we still believe that some of them are outside our quality guidelines."
So here's the issue I need your advice on. We have tens of thousands of business websites linking back to our main site using our domain name, and we're assuming this is why Google cited them as examples of violating the link quality guidelines. **How can we fix this without losing the traffic from all those backlinks, or making our traffic tank even worse than it already has?**
- Can we replace the domain-name anchor text with our logo image and still link?
- Can we nofollow all those links?
- Can we link to internal pages or sections instead of the home page, with no more than 10% of the links pointing to any one section?
- Should we just remove the links and cry?
-
Personally, I'd nofollow them. You don't want them to pass on any ranking signals, so nofollow the links that are driving traffic; if they don't drive traffic, get rid of them.
Moosa is right, though: you need to use other tools aside from WMT, because it doesn't show a complete list of links to your website.
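For what it's worth, nofollowing is just an attribute on the link, and swapping the text for a logo doesn't remove the anchor signal entirely, since the image's alt text is still read as anchor text. A minimal sketch of what the "powered by" footer link could look like (the URL and logo path are placeholders, not the actual site):

```html
<!-- Hypothetical "powered by" footer link. rel="nofollow" asks Google
     not to pass ranking signals through the link; visitor traffic
     still flows through it normally. -->
<a href="https://www.domainname.com/" rel="nofollow">
  <!-- If an image replaces exact-match anchor text, the alt text is
       still treated as anchor text, so keep it brand-like. -->
  <img src="/images/logo.png" alt="domainname.com logo">
</a>
```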
-
We've been disavowing links found in Google Webmaster Tools and Link Detox. It took one team member 4.5 months to work through 80k links.
Our business websites used our domain name to link to us, as in "powered by domainname.com." I can only assume that's why Google cited those examples in the latest denied reconsideration request: we're linking from each site back to our own, and they're all on the same class C IP range.
Should we still link using our domain name, or remove those links?
-
OK, I recently got a penalty revoked after receiving three rejections from Google, so I'm going to share my experience and how I got past it!
- The links displayed in Google Webmaster Tools are not the full picture; pull additional data from Majestic SEO, Ahrefs, Moz, and others.
- You can use Link Detox or Link Risk to see which links are risky for your website.
Try to remove as many links as possible and disavow the rest.
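As a reference for the disavow step, the file Google's disavow tool accepts is plain text, one URL or domain per line; a minimal sketch (the domains below are made-up examples):

```text
# Disavow file sketch — all domains here are placeholder examples.
# Lines beginning with # are comments and are ignored by Google.

# Disavow a single spammy URL:
http://spam-forum.example.com/thread?id=12345

# Disavow an entire domain (generally safer for obvious link farms):
domain:link-farm.example.net
domain:spammy-directory.example.org
```

You upload the file through the disavow tool in Webmaster Tools for the affected domain.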
As far as anchor text is concerned, always keep in mind that if you have a lot of good websites linking back to you with a keyword in the anchor text, you should diversify the anchor text or switch it to your brand name.
In my opinion, the percentage of brand-name anchors should be greater than the percentage of any other single keyword anchor (the sign of a natural-looking link profile).
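If you want to check where your profile stands, a quick script over a backlink export will do it. A rough sketch, assuming a CSV with an "anchor" column (real exports from Majestic/Ahrefs/Moz name their columns differently, so adjust; the brand term is a placeholder from the question):

```python
# Sketch: compute the anchor-text distribution from a backlink CSV export
# so you can compare brand-name anchors against keyword anchors.
import csv
from collections import Counter

BRAND = "domainname.com"  # placeholder brand term

def anchor_distribution(path):
    """Return (anchor, count, percent) tuples, most common first."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["anchor"].strip().lower()] += 1
    total = sum(counts.values())
    return [(a, n, 100.0 * n / total) for a, n in counts.most_common()]

if __name__ == "__main__":
    for anchor, n, pct in anchor_distribution("backlinks.csv")[:20]:
        flag = "  <- brand" if BRAND in anchor else ""
        print(f"{pct:5.1f}%  ({n})  {anchor}{flag}")
```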
At the same time, plan a kick-ass content marketing campaign that earns natural links; it signals to Google that you're now on the right path!
Hope this helps!