We have been hit by Penguin 2.0. What should we do?
-
Hi,
Last week we got hit by Penguin 2.0. Our sites dropped an average of 10 places on most keywords, after holding steady positions for 2 to 3 years.
We have site-wide links at the top of our websites pointing to our other websites (about 9 e-commerce sites). Today I added rel="nofollow" to all of these links (except on the homepages) to prevent them from counting as spammy links.
Is there anything else we can do?
Most important keyword: klokken (previous position: 2nd place)
Search engine: Google Netherlands
Thanks a lot for your help.
-
Try to keep exact-match anchor text links to a maximum of 30-40% of your profile if you want to steer clear of Penguin. The rest of your anchors should be things like your brand name, your URL, "click here", etc. Look at the link profiles of other sites to see what a more natural profile looks like.
Ultimately you want to avoid links where you have control over anchor text in the first place, and try to attract organic links to your site, which is ultimately what Google wants.
-
Thanks Takeshi,
I'm going to change the anchor texts, and remove some links.
There is one thing I still have a question about: did we get a penalty from Google, or did we lose the value of the bad links? Is there a way to find this out?
-
Takeshi,
How would you link for a particular keyword, i.e. what should the anchor text be? Some of my sites have been penalized; I don't see many "spammy" links in our profile, but I do see over-optimized anchor text.
-
The Penguin updates primarily target sites that have a lot of spammy external links with exact anchor text match.
So if your keyword is "klokken", and a large number of your external links (let's say greater than 40%) use the keyword "klokken" as the anchor text, then Google will think you have an unnatural link profile, and that you're just trying to game the search results.
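As a rough way to check this on your own profile, you could export your inbound links (e.g. from Open Site Explorer or GWT) and compute the share of exact-match anchors. This is just an illustrative sketch; the anchor list and the ~40% threshold are example values, not Google-confirmed numbers:

```python
from collections import Counter

def anchor_text_ratio(anchors, keyword):
    """Return the fraction of links whose anchor text exactly matches `keyword`."""
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[keyword.lower()] / len(anchors)

# Example: 3 of 5 anchors are the exact-match keyword "klokken"
anchors = ["klokken", "Klokken", "our-brand.nl", "click here", "klokken"]
print(round(anchor_text_ratio(anchors, "klokken"), 2))  # prints 0.6 -> above a ~40% threshold
```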
The only thing to do with Penguin is to remove all the spammy/unnatural links coming to your site. For those links you can't remove, use the disavow tool in Google Webmaster Tools. Then, once all the links are removed or disavowed, file a reconsideration request.
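For reference, the disavow file is just plain text, one URL or domain entry per line, with # lines as comments. The domains below are made-up examples:

```text
# Spammy directory; owner never responded to removal requests
domain:spam-directory-example.com
# Single page we could not get taken down
http://link-farm-example.net/page1.html
```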
Expect your site not to rank as well even if you recover from the penalty, since you've lost a lot of links. Then it's time to start building links.
Related Questions
-
Google only indexing the top 2/3 of my page?
Hi, I have a page that is about 5000 lines of code total. I was having difficulty figuring out why the addition of a lot of targeted, quality content to the bottom of the page was not helping with rankings. Then, when fetching as Google, I noticed that only about 3300 lines were getting indexed for some reason. So naturally, that content wasn't going to have any effect if Google is not seeing it. Has anyone seen this before? Thoughts on what may be happening? I'm not seeing any errors being thrown by the page... and I'm not aware of a limit on lines of code Google will crawl. Pages load in under 5 seconds, so loading speed shouldn't be the issue. Thanks, Kevin
-
Trailing Slashes for Magento CMS pages - 2 URLs - Duplicate content
Hello, Can anyone help me find a solution for making Magento CMS pages use only one URL and not two URLs?

www.domain.com/testpage
www.domain.com/testpage/

I found a previous article that applies to my issue, which uses .htaccess to 301 redirect requests for Magento pages from the non-slash URL to the slash URL. I don't fully understand the .htaccess syntax, but I used the code below. It fixed the CMS page redirection but caused issues on other pages, like all my categories and products, with this error: "This webpage has a redirect loop ERR_TOO_MANY_REDIRECTS"

```apache
# Assuming you're running at domain root. Change to working directory if needed.
RewriteBase /

# www check
# If you're running in a subdirectory, then you'll need to add that in
# to the redirected URL (http://www.mydomain.com/subdirectory/$1)
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]

# Trailing slash check
# Don't fix direct file links
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ $1/ [L,R=301]

# Finally, forward everything to your front-controller (index.php)
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule .* index.php [QSA,L]
```
-
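A hedged suggestion (untested; page names are placeholders): the loop likely comes from applying the trailing-slash redirect to every URL, including category and product URLs that Magento itself redirects the other way. Limiting the slash rule to the specific CMS pages avoids touching those:

```apache
# Add the trailing slash only for specific CMS pages (placeholder names),
# leaving category, product and admin URLs for Magento to handle
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(testpage|another-cms-page)$ /$1/ [L,R=301]
```
-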
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we'll just have to put up with broken links on the site on a more regular basis. So here's where my thought process is leading:

1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.

We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I'm off base, please set me straight. 🙂
-
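One hedged aside on the htaccess-bloat concern: Apache's RewriteMap can look up redirects in an external file, so the rule set stays tiny no matter how many redirects exist. Note that RewriteMap must be declared in the server or virtual-host config, not in .htaccess; file paths below are placeholders:

```apache
# In the virtual-host config (RewriteMap is not allowed in .htaccess).
# redirects.txt holds one "old-path new-path" pair per line; for millions
# of entries, convert it to a dbm map with httxt2dbm for fast lookups.
RewriteMap redirects "txt:/etc/apache2/redirects.txt"

RewriteEngine On
# Redirect only when the requested URI has an entry in the map
RewriteCond ${redirects:%{REQUEST_URI}} !=""
RewriteRule .* ${redirects:%{REQUEST_URI}} [R=301,L]
```
-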
1 site on 2 domains (interesting situation, expert advice needed)
Dear all, I have read many posts about having one site's content on 2 different domains, and how to combine the two to avoid duplicate content. However, the history of my two domains makes this question really difficult.

Domain 1: chillispot.org ( http://www.opensiteexplorer.org/links?site=chillispot.org ) The original site was on this domain, started 9 years ago. At that time the owner of the domain was not me. The site was very popular, with lots of links to it. Then, after 5 years of operation, the site closed. I managed to save the content to:

Domain 2: chillispot.info ( http://www.opensiteexplorer.org/links?site=chillispot.info ) The content I put there was basically the same. Many links on external sites were changed to chillispot.info when people noticed the change, but lots of links are still unchanged and pointing to the .org domain. The .info is doing well in search engines (for example for the keyword 'chillispot'). Now I have managed to buy the original chillispot.org domain. As you can see, the domain authority of the .org domain is still higher than the .info one, and it has more valuable links.

Question is: what would be the best approach to offer content on both domains without being penalized by Google for duplicate content? Which domain should we keep the content on? The original .org one, which is still the better domain but has not been active for several years, or the .info one, which has had the content for several years now and is doing well in search engines? And then, after we decide this, what would be the best approach to send users to the real content? Thanks for the answers!
-
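A hedged sketch for the "send users to the real content" part of this question: whichever domain is chosen as canonical, the usual approach is a site-wide 301 from the other domain, so visitors and remaining link equity are consolidated rather than split. Assuming .org is kept:

```apache
# On the chillispot.info vhost: 301 every request to the same path on .org
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?chillispot\.info$ [NC]
RewriteRule ^(.*)$ http://www.chillispot.org/$1 [R=301,L]
```
-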
Hit hard on Google last October.
Hi all, I'm just wondering if anyone is able to help me with why my website lost practically all its ranking last October (2012). My website is here: http://bit.ly/nAOfNj Since early 2010, we have been ranking in the top 3 for our keyword when searched all around the country. Between end September and end October 2012, we started dropping (from 2nd to 8th, then in December, 13th, January, 18th place... and then March back up to 13th, now ~10th). The main problem seems to be that Google has changed how websites rank for our keyword (trampoline). In Brisbane, Australia (where we are based), we only rank in the local organic searches. We don't have a separate listing there anymore (with the meta description), even though we had a normal organic listing (and local listing) for the last 2 years! When searching from other states/suburbs further away, we dropped way off the first page. Our product is sold by resellers in 400 stores around Australia, so it's not like we're just in Brisbane. Has anyone experienced Google changing how they return results for a specific keyword like this? Did they do it a lot towards the end of last year? We have a Places page for Brisbane, but for some reason I have little to no control over it (a Places/Local+ stuff-up means I can't manage the page on Local+, can't add pictures/videos etc.). My boss suggested we even try deleting the Maps page or our Local+ page to get out of there. We don't get anywhere near as much traffic through the local listing as through a normal listing... I'm not sure if that's best though? From what I can tell, the only Google algorithm update that may have affected us at the time (October 9th) was the page layout update that penalised(?) sites that have a lot of "ads" above the fold. Our website is designed to have splash banners on the top of every page to either promote our own product, competitions or the athletes we sponsor.
Up until last week, the banners were always 500px high on larger desktop screens and 300px high on smaller desktops, laptops, iPads etc. I have recently changed them all to be 300px high to test, but I imagine I'll have to wait a while? Is this the kind of content that Google means by "ads above the fold"? I've spent the last 4 weeks working on our SEO, from HTML validation to rich snippets, content optimisation, a lot more internal linking, setting up some location-based content, doing a lot of keyword research, and now starting to work on cleaning up our blog and creating some real shareable content that we'll share on our Facebook. I really just wish I knew where the problem was so I could tackle it 😞 Any advice would be GREATLY appreciated!!
-
Has anyone noticed whether Penguin 2.0 has been launched?
I read a report that Penguin 2.0 is already running. Does anyone have more information about that, or any tool to see how rankings have been affected?
-
Hi, my site dropped from PageRank 4 to 0
I have done everything the correct way and my site shouldn't break any guidelines. Can someone tell me where I can contact Google and appeal this, please? Also, can someone tell me if links play a part in this, or if it is something to do with the Penguin update and my site has been wrongly affected? My URL is below: http://www.diamondwaste.co.uk/
-
How to move domain content with a Penguin penalty?
Hey guys, I've come to the conclusion that the sheer number of bad links one of our sites has is beyond repair. We own a .net version with the same brand name, so I'm planning to move our e-commerce store over with all its content. I could move the site in one swoop, but I believe Google will see it as duplicate content if we don't let the old site de-index first. I would simply take the old site down for a month, but we still get some orders now and then. Anyone have any ideas? I was thinking of leaving an image up on each page, with the page set to noindex/nofollow and linked to the new site, explaining that the site is being moved, etc.