We have been hit by Penguin 2.0, what should we do?
-
Hi,
Last week we got hit by Penguin 2.0. Our sites dropped an average of 10 places on most keywords, after holding steady positions for 2 to 3 years.
We have site-wide links at the top of our websites pointing to our other websites (about 9 e-commerce sites). Today I added rel="nofollow" attributes to all of these links (except on the homepages) to prevent them from being counted as spammy links.
Is there anything else we can do?
Most important keyword = klokken (previous position: 2nd place)
Search engine = Google Netherlands
Thanks a lot for your help.
-
Try to keep exact-match anchor text links to 30-40% of your profile at most if you want to steer clear of Penguin. The rest of your anchors should be things like your brand name, your URL, "click here", etc. Look at the link profiles of other sites to see what a more natural profile looks like.
Ultimately you want to avoid links where you have control over anchor text in the first place, and try to attract organic links to your site, which is ultimately what Google wants.
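As a rough illustration of the ratio check above, here is a minimal sketch in Python. The anchor list is made-up example data; in practice you would export your real anchors from a link tool and count them the same way:

```python
# Hypothetical sketch: estimate what share of a link profile uses
# exact-match anchor text. The anchors list below is invented example data.
from collections import Counter

anchors = [
    "klokken", "klokken", "klokken", "klokken",           # exact-match keyword
    "example-shop.nl", "www.example-shop.nl",             # URL anchors
    "Example Shop", "Example Shop", "click here", "here"  # brand / generic
]

counts = Counter(anchors)
total = len(anchors)
exact_match_share = counts["klokken"] / total

print(f"exact-match share: {exact_match_share:.0%}")  # prints: exact-match share: 40%
```

In this toy profile the exact-match share sits at 40%, right at the top of the suggested range, so some of those anchors would be candidates for changing or removal.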
-
Thanks Takeshi,
I'm going to change the anchor texts and remove some links.
There is one thing I still have a question about: did we get a penalty from Google, or did we just lose the value of the bad links? Is there a way to find this out?
-
Takeshi,
How would you link for a particular keyword, i.e. what should the anchor text be? Some of my sites have been penalized. I don't see many "spammy" links in our profile, but I do see over-optimized anchor text.
-
The Penguin updates primarily target sites that have a lot of spammy external links with exact anchor text match.
So if your keyword is "klokken", and a large number of your external links (let's say greater than 40%) use the keyword "klokken" as the anchor text, then Google will think you have an unnatural link profile, and that you're just trying to game the search results.
The only thing to do with Penguin is to remove all the spammy/unnatural links pointing to your site. For the links you can't remove, use the disavow tool in Google Webmaster Tools. Then, once the links are removed or disavowed, file a reconsideration request.
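For reference, the disavow file is a plain UTF-8 text file with one entry per line: `domain:` prefixes disavow a whole domain, bare URLs disavow individual pages, and `#` lines are comments. A minimal sketch (all domain names here are placeholders):

```text
# Spammy directory we could not get removed
domain:spammy-directory-example.com

# Individual paid links we could not get taken down
http://example-blog-network.com/post-123
http://another-example-site.com/links.html
```

Disavowing a whole domain is generally safer for sites that are clearly spam, since it also covers links from that domain you haven't found yet.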
Expect your site not to rank as well even if you recover from the penalty, since you've lost a lot of links. Then it's time to start building links.
Related Questions
-
Why did rankings drop from the 2nd page to the 8th page with no penalization?
Dear Sirs, a client of mine of more than 7 years used to have his home page (www.egrecia.es) between the 1st and 2nd page of the Google SERPs, and it suddenly dropped to the 8th page. The keyword in question is "Viajes a Grecia". It has a good link profile, as we have built links in good newspapers from Spain, and according to Moz it has 99% on-page optimization for that keyword. Why? What could I do to solve this? PS: It has more than 20 other keywords in 1st position, so why did this one drop so far? Thank you in advance!
Intermediate & Advanced SEO | Tintanus
-
Google only indexing the top 2/3 of my page?
Hi, I have a page that is about 5,000 lines of code in total. I was having difficulty figuring out why adding a lot of targeted, quality content to the bottom of the page was not helping with rankings. Then, when fetching as Google, I noticed that only about 3,300 lines were getting indexed for some reason. Naturally, that content isn't going to have any effect if Google is not seeing it. Has anyone seen this before? Thoughts on what may be happening? I'm not seeing any errors being thrown by the page, and I'm not aware of a limit on the lines of code Google will crawl. Pages load in under 5 seconds, so loading speed shouldn't be the issue. Thanks, Kevin
Intermediate & Advanced SEO | yandl
-
Ranking 2 pages on the same domain in the same SERP
I thought it was generally said that Google will favour 1 page per domain for a particular SERP, but I have seen examples where that is not the case (i.e. the same domain ranking 2 different pages on the 1st page of the SERPs...). Are there any "tricks" to taking up 2 first-page SERP positions, or am I mistaken that this doesn't always happen?
Intermediate & Advanced SEO | Ullamalm
-
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we'll just have to put up with broken links on the site on a more regular basis. So here's where my thought process is leading:
1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I'm off base, please set me straight. 🙂
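On the htaccess-bloat concern: one way to handle large numbers of 301s without a giant rule file is Apache's RewriteMap, which looks redirects up in an external file (or an indexed dbm map, which scales to millions of entries). A sketch, assuming Apache with mod_rewrite; note RewriteMap must live in the server or virtual-host config, not in .htaccess, and the paths below are placeholders:

```apache
# httpd.conf / vhost config (RewriteMap is not permitted in .htaccess)
RewriteEngine On
RewriteMap listingmap "txt:/etc/apache2/listing-redirects.txt"

# Redirect only when the requested path has an entry in the map;
# everything else falls through (and can 404 normally).
RewriteCond ${listingmap:$1|NOT_FOUND} !=NOT_FOUND
RewriteRule ^/?(.*)$ ${listingmap:$1} [R=301,L]
```

The map file is just one `old-path new-url` pair per line, so it could be regenerated automatically as part of each data refresh for the "mission-critical" pages, while the rest are left to 404 as planned.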
Intermediate & Advanced SEO | ufmedia
-
2-websites focused on different markets but similar content
Hi all! I have a client who wants to branch out to another market (they're currently in Northern California and want to open an office in Southern California). What would happen if we put up a second website with similar content, but exclusively for Southern California, with a different office address and all the content geared towards the Southern California market? There would be NO linking between the sites. Would that generate a penalty? Thanks! BB
Intermediate & Advanced SEO | BBuck
-
Community question- Penguin 2.0 link types?
What type of links do you think Penguin 2.0 targeted most: anchor text abuse, directory links, paid links, low-quality guest posts, article directories, etc.?
Intermediate & Advanced SEO | DavidKonigsberg
-
Recent Penguin Update
Hi SEOmoz, Today www.carrentalbuddy.com.au was hit pretty hard by the Penguin 2.0 update (I believe). We had some pretty strong rankings for multiple search terms, and we believe we have done everything by the book for Google. We can't figure out why our rankings have dropped so dramatically recently, and we were hoping some SEOmozzers could take a quick look to help us fix this problem. Kindest Regards, Chris
Intermediate & Advanced SEO | kymodo
-
Penguin/Panda/Domain Purchase
If I move forward with the acquisition:
1. Should I, if there is a way, just acquire the domain and then attempt to unlink existing links?
2. Can I just buy the domain, completely kill the site, and then build again from scratch? Even if I do that, the links to the domain will still be out there.
3. Should I even move forward with the purchase if I know these tactics have been used?
Thanks!
Intermediate & Advanced SEO | dbuckles