Google Penalty
-
Hello there. It has just been confirmed that my site is suffering from a Google penalty. The odd thing is that it affects only my set of key keywords, not the whole site.
It seems that in 2010 a rogue SEO company I used sent loads of spammy links, but I have lots of other good links too.
What is your advice moving forward? Will it drop over time? Do I set up a new site (this one has been established for over 10 years), or try to get better-quality links to compensate for the spammy ones?
Thanks so much! Kindest regards
Victoria
-
I agree with Gary: you should definitely go through all your links and be really ruthless about identifying the poor-quality ones. Anything on an article or directory website you can probably get rid of (apart from directories like Yahoo, DMOZ, etc.), and then submit your disavow file to Google.
If only your target keywords have dropped, then you have been lucky to escape a site-wide penalty, but there is a good chance that one is on its way. If you can clean up your link profile sooner rather than later, it should help.
-
Do you have a manual penalty? How have you confirmed that you have the penalty? Is there an option to send a reconsideration request in Webmaster Tools?
Removing those links is always the best thing to do. However, I would start by putting them all in a disavow file, so that Google can start to ignore them as followed links; meanwhile, you can work on removing them.
If you do have a manual penalty, you will have a lot more work to do. Let me know and I will guide you.
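For reference, the disavow file mentioned above is a plain-text list with one entry per line: `domain:` entries disavow every link from a site, bare URLs disavow individual pages, and lines starting with `#` are comments. A minimal sketch (the domains here are made up):

```text
# Spammy article-directory links from the 2010 campaign
domain:spammy-article-farm.example
domain:low-quality-directory.example
# A single bad page, rather than the whole site
http://blog.example/paid-links-roundup.html
```

The file is uploaded through the Disavow Links tool; each new upload replaces the previous file, so an updated version should contain the full list, not just the additions.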
-
You can remove the "spammy" links, but make sure they are in fact spammy: removing all low-end links can also hurt your rankings, and some of them may not have been doing you any harm in the first place. You can also, as you mentioned, build some good links to help counteract the bad ones.
Related Questions
-
01 November 2013 - A possible Google update?
Hello guys! On 1 November 2013, one of our websites' traffic dropped by more than 70%. We had added plenty of domains to the disavow tool, so I am just wondering: was it a Google update, or has the disavow tool destroyed the site? Any ideas?
Link Building | TheSEOGuy1
-
Google Disavow File Update
Is there any specific format for updating the disavow file? Also, I submitted the file a month ago and need to update it now: should I leave the old excluded domains in, or should I remove them? Let's say this is what I have:
# explanation to Google went here... and ended here
domain:exampledomainalreadysubmitted1.com
domain:exampledomainalreadysubmitted2.com
domain:exampledomainalreadysubmitted3.com
How would you update it? Thanks for your input.
Link Building | dhidalgo
-
Did Google Panda and Penguin have any effect on outbound links?
I am just a beginner in SEO, but have any of the changes recently implemented by Google affected outbound links? Meaning: if a website has a good number of outbound links to strong websites and pages, could that translate to better rankings?
Link Building | sherohass
-
Link building / baiting in the Google zoo
I work for a consultancy, and in the past most of our links have been acquired by giving away privacy statements etc. for websites, including a link back in the body of the document, and making it a licensing requirement that the link be kept. We're launching a new site. We want this one to be whiter than white, and would appreciate some advice on the following options.
Option 1: no links. Remove the links from the documents, and don't require links for the use of the documents. Leave a non-linking credit in the documents. Perhaps ask nicely for links from other pages.
Option 2: links on other pages. Remove the links from the documents, but make it a licensing requirement that users link to our site from another page on their site. I appreciate that most won't, but some will.
Option 3: retain the links. Keep the links in the documents, using domain-name (with and without http and www) and business-name anchor text.
Option 4: script the links. Use scripts to generate randomized links in the documents, so that no two are the same, but with relevant linking text for the most part.
We're risk-averse with the new site, and it will pick up some links "naturally". We're therefore tending toward option 1, on the basis that it may well generate as many links as option 2. Which of these options would you choose? Are there any other options we should be considering?
Link Building | seqal
-
Why is Google not following a 301 redirect on the robots.txt file?
Hi guys, I recently posted a question on the Google Webmasters Forum (http://www.google.com/support/forum/p/Webmasters/thread?tid=683e71557db7fd54&hl=en&fid=683e71557db7fd540004a4b4add8cbb6) and didn't get satisfactory feedback, so I thought I would put this to the SEO gurus on here. Perhaps one of you might be good buddies with Matt and be able to ask him directly; I actually posted on Matt's blog, but he hasn't got back to me. Basically, we did a URL restructure for a client and set up 301 redirects, and saw a huge drop in rankings over time. The 301 redirects seem to work fine and have been tested by many, many people. We suspected that Google might be ignoring or devaluing the 301 redirects, so I reviewed the server log to see what happens when Googlebot crawls the site. It showed that on many occasions Googlebot did not reload the page after hitting a 301 redirect. Sure, you might say it probably queues it, or that Google might just be checking that the 301 redirect is still in place, but why check so often (within a few hours to a day on the same URL)? It even skips the 301 redirect on the robots.txt file, i.e. from http://clientsite.com/robots.txt to http://www.clientsite.com/robots.txt, the non-www to www version. I don't think the skipping of the robots.txt file is easy to dismiss: this 301 redirect should be followed immediately so that Googlebot gets the instructions it needs to crawl the site. Any help will be appreciated. I can send the server log to anyone personally, but I am reluctant to post it on here. Regards, Zan
Link Building | FRL
-
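For context, the non-www to www 301 described in that question is typically implemented as a host-based rewrite that also covers /robots.txt. A sketch assuming an Apache server, with clientsite.com standing in for the real domain:

```text
# Redirect every request on the bare domain (including /robots.txt)
# to the www host with a permanent 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^clientsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.clientsite.com/$1 [R=301,L]
```

The nginx equivalent would be a separate server block for the bare domain containing a single `return 301` directive.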
Do "Sponsored Posts" links get discounted by Google?
If a featured post is tagged by the site as "featured" or "sponsored", is it discounted as a paid link by Google?
Link Building | DavidKonigsberg
-
Does Google extract keywords from links that use the URL as the anchor text?
If someone formats a link like this: http://www.somedomain.com/keyword1-keyword2/keyword3.php would you still get a similar anchor-text benefit as if it were formatted like this: somedomain keyword1 keyword2 keyword3? In other words, does Google extract keywords from links that use the URL as the anchor text?
Link Building | adriandg
-
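As a rough illustration only (not Google's actual algorithm), the keywords in a URL like the one in that question can be recovered by splitting the path on common separators. The function below is a hypothetical sketch:

```python
import re
from urllib.parse import urlparse

def url_keywords(url):
    """Split a URL's path into candidate keyword tokens."""
    path = urlparse(url).path
    # Drop a trailing file extension such as .php or .html
    path = re.sub(r'\.(php|html?|aspx?)$', '', path)
    # Split on slashes, hyphens, underscores, and dots
    tokens = re.split(r'[/\-_.]+', path)
    return [t.lower() for t in tokens if t]

print(url_keywords("http://www.somedomain.com/keyword1-keyword2/keyword3.php"))
# ['keyword1', 'keyword2', 'keyword3']
```

Whether Google weights such tokens the same way as plain-text anchor text is exactly what the question asks; this snippet only shows that the keywords are trivially extractable from the URL itself.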
Linking to Google Place pages?
I'm wondering if anybody has tried, and had success with, improving search rankings by creating links to Google Place pages. What say you?
Link Building | bdiddy