When to re-submit for reconsideration?
-
Hi! We received a manual penalty notice. A couple of years ago we had an SEO company build some links for us on blogs. Currently we have only about 95 of these links, which are pretty easily identifiable by the anchor text used and the blogs or directories they originate from.
So far, we have seen about 35 of those removed and have made 2 contacts to each one via removeem.com.
So, how many contacts do you think need to be made before submitting a reconsideration request? Is 2 enough?
Also, should we use the disavow tool on the remaining 65 links? Every one of them is from either a Filipino blog page or a random article directory.
Finally, do you think we are still getting juice from these links? That is, if we do remove or disavow these anchor-text links, will we actually see a negative impact?
Thanks for your help and answers!!
Craig
-
Sure, or e-mail me at davec@evolvecreativegroup.com
-
Hi Dave,
Would you mind if I PM you a few of these examples?
Thanks!
Craig
-
I can't answer that accurately without knowing A) which pages on those sites link back to your site, and B) what your website is about (theme, category, etc.).
-
Thanks, Dave. Would you agree that the links above are the types of sites we need to be removing, and that removing or disavowing them shouldn't hurt us?
Here are a few more examples:
http://linkssolutions.org
http://alcoosite.org
http://dbindex.info
http://xyzdirectory.info
http://topdirlisting.com
http://freearticlesinc.com
http://seenation.com
http://articlerich.com
http://ipunjab.com
The blog posts I know need to go. It is the article and directory sites that I am a little unsure of.
Thanks for taking the time to answer.
Craig
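If you do end up disavowing the remaining directories, disavowing at the domain level is usually safer than listing individual URLs, since these sites could link from multiple pages. A small sketch (the deep path below is hypothetical, just for illustration) that dedupes a link list into `domain:` entries:

```python
from urllib.parse import urlparse

def disavow_lines(urls):
    """Convert backlink URLs to domain-level disavow lines, deduped."""
    domains = []
    for url in urls:
        host = urlparse(url).netloc or url
        if host.startswith("www."):
            host = host[4:]  # treat www and bare domain as the same site
        if host and host not in domains:
            domains.append(host)
    return ["domain:" + d for d in domains]

# A few of the directories from the list above:
links = [
    "http://linkssolutions.org",
    "http://www.articlerich.com/some-article",  # hypothetical article path
    "http://articlerich.com",
]
print("\n".join(disavow_lines(links)))
```

Running this prints one `domain:` line per unique site, which is the format the disavow tool accepts.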
-
You're lucky you got a manual penalty and not an algorithmic one. With a manual penalty you get to use the disavow tool, say you're sorry, and come back. Don't hesitate to use the disavow tool since you got the manual notice.
-
Yikes on the 5000 bad links! Yes, we are lucky. However, I am a little concerned that Google thinks we have more bad links than we do and is considering our organic links as paid or something. See (http://www.seomoz.org/q/to-remove-or-not-to-remove)
Here are a few examples of the sites that are linking to us; pretty much all of the sites we have flagged as bad links are similar to these, and these are the only links that were paid for via the SEO company. They seem like obvious candidates for the disavow tool, but I just want to be sure. I have heard so many cautionary comments about the disavow tool that I wonder if we should use it at all.
http://www.sackthetickettax.com/
http://www.cowboysandangels.info/
http://www.businessdesmoines.com/
http://www.vespertinecrawl.com/
If any of these seem like sites we would want to keep links from, please let me know. Or, if they all seem like links we would definitely want to disavow, let me know as well.
Thanks for your help and quick answer!
Craig
-
First, I think you're lucky that you only have about 60 links outstanding; we recently took on a client who had over 5,000 bad links!
If you have emailed or otherwise tried to contact the remaining websites, then I'd say it's time to turn to the disavow tool. Be careful when using it, though: you have to be 100% sure these are bad links before disavowing them. Once you have submitted the disavow file (which can take a week or two to filter through to Google), you will get a message in Webmaster Tools confirming they have received it.
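For reference when you get to that step: a disavow file is just a UTF-8 text file, one URL or `domain:` entry per line, and lines starting with `#` are treated as comments (handy for noting your removal attempts). A minimal sketch with hypothetical entries:

```text
# Contacted site owners twice via removeem.com, no response.
domain:example-directory.info
domain:example-article-site.com
# Single spammy guest post; the rest of the blog looks legitimate.
http://example-blog.com/spammy-guest-post/
```

Note that a `domain:` entry disavows every link from that site, while a bare URL disavows only that one page.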
I'd say at this point, resubmit for reconsideration. If you have evidence that you tried to contact these websites then include that in your request because Google wants to see that you have made an effort to get them removed first.
If the links are as bad as you say, they are probably doing more harm than good, so don't worry about the link juice they are passing.