Google disavow tool
-
I have an algorithmic penalty on one of my websites. I never received a notification of a manual penalty in GWMT, and I even sent in a reconsideration request 6 months ago and they told me there were no manual penalties on the website.
I have cleaned up my link profile, and what I could not clean up I submitted through the Google disavow tool a few days ago. I've heard that if the penalty is algorithmic I should just wait, but should I send in another reconsideration request now that I've used the disavow links tool?
-
Since you have already cleaned things up and disavowed the ones you couldn't, I would not recommend sending a reconsideration request. Instead, I would start doing marketing as usual. Start getting some high quality, optimized, fresh content on your site and proceed with your solid white hat SEO method of operation. Be patient.
Related Questions
-
How to leverage Google Images?
My Google search rankings are improving rapidly at the moment, but a lot of my rankings are for images (I presume that means the images are appearing near the top in Google Images). How do I capitalise on that? It's not really much help to me that my images are popular unless it results in traffic to the pages where those images are used. I am running WordPress, so I have the option to have images embed as "no link", "link to attachment page", "link to original image", etc. Is there any advantage of using one of these over the others? I'd really like to set it up so that when a Google Images user clicks "View Image" it loads the attachment page or the host content page rather than the image. Bad SEO? I'm not sure whether the fact that I'm using Jetpack Photon CDN image hosting will make this more complicated or not. Tony
Intermediate & Advanced SEO | Gavin.Atkinson
-
Not ranking in Google - why???
This will be a bit long, so please bear with me. I have a client in the auto parts industry who wants to rank their homepage for 13 different keywords. We are ranked on the first page for all keywords in Yahoo! Mexico and Bing Mexico, but not ranking on the first page at all in Google Mexico. My client's competitor, however, is clearly outranking my client in Google. When comparing both pages, my client's, while not 100% optimized, looks better optimized than their competitor's. Looking at all metrics using Moz, SEMrush, Ahrefs, etc., my client's site looks MUCH better on all fronts. I know ranking a single homepage for more than 10 keywords is a difficult task. Our competitor is, however, ranking for them, so it's not impossible. The keywords are not even that competitive according to Moz's analysis. I decided to create an optimized page for each keyword to try to rank those pages, but my client still wants the homepage to rank (again, if the competitor is ranking, then it's possible), and I am afraid the pages I created could result in keyword cannibalization, ultimately affecting the homepage's ability to rank. My client had a previous SEO agency working for them, and basically all they did was create fake blogs with lots of keyword-rich links directed to the site's homepage. I got the complete link profile from several tools and submitted a disavow request for as many fishy links as I could find, but that hasn't shown any results so far. Note: when looking at the competitor's link profile, they have basically just a few links and no external links of real value whatsoever. My client is obviously very frustrated, and so am I. In my SEO experience, it shouldn't be such a difficult task to accomplish, yet nothing seems to work even though everything points to my client ranking higher. So now I'm running out of ideas regarding what to do with this site. Any insight you could provide would be SO helpful to me and my client.
If needed, I can provide my client's homepage URL and also their competitor's homepage for you to review. I can also give you any extra information you need. Thanks a lot!
Intermediate & Advanced SEO | EduardoRuiz
-
Disavow tool.
Hello, one of my sites has a strange link profile; it has 40,000 inbound links, but 30,000 of them are from the site http://ourlipsaresealed.skynetblogs.be/ with the anchor text "haarstijl (2)", which is Dutch for hairstyles. I haven't paid for or even asked for these links, and I don't think it's negative SEO. I think they just set up a template with hundreds of links they thought were useful to their visitors, and they produce several pages a day. So the question is: do I use the new Google disavow tool? I've held off so far because A. they link to a competitor who hasn't been anywhere near as affected as we have (although they do seem to have dropped somewhat for some reason, and they have a much better link profile overall than mine), and B. in the video Matt Cutts says over and over that this tool is for people who have done some dodgy link building in the past, but I haven't. Thanks, Ian
Intermediate & Advanced SEO | jwdl
-
Help needed on Google Webmaster Tools
Hi, I notice that on one of my oldest sites, even if I build hundreds of backlinks (good or bad), Google Webmaster Tools never shows more than about 20 indexed per day. Why is this happening? Do they control it? I mean, do they not let them all get indexed at once and instead take it slowly? And if so, is about 20 links per day the ideal link building pace? Thank you
Intermediate & Advanced SEO | nyanainc
-
Googlebot vs. Google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, cause I am kinda stuck 😕 Situation: a client of mine has a webshop located on a hosted server. The shop is built in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and the Google mobile bot. In Default.asp (ASP classic), I do a test on the user agent and redirect the visitor to either the main domain or the mobile sub-domain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel=canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
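For illustration, the kind of user-agent split the asker describes doing in Default.asp could look like the sketch below, written here in JavaScript since that is the only scripting language available inside this CMS. The bot tokens and domain names are assumptions for the example, not the client's real setup:

```javascript
// Hypothetical sketch of a server-side user-agent split like the one
// described for Default.asp. Bot tokens and domains are made up.
function isMobileAgent(userAgent) {
  // Route Googlebot-Mobile and common mobile browser tokens to the mobile shop
  return /Googlebot-Mobile|iPhone|Android|Mobile/i.test(userAgent);
}

function shopUrl(userAgent) {
  return isMobileAgent(userAgent)
    ? "http://m.example-shop.com/"    // assumed mobile sub-domain
    : "http://www.example-shop.com/"; // assumed desktop shop
}
```

As the question points out, a redirect like this only runs when the visitor actually enters through the root file; bots entering on deep URLs already in the index bypass it entirely, which is exactly the problem described.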
Intermediate & Advanced SEO | ReneReinholdt
-
Getting rid of a site in Google
Hi, I have two sites, let's call them Site A and Site B; both are sub-domains of the same root domain. Because of a server config error, both got indexed by Google, and Google reports millions of inbound links from Site B to Site A. I want to get rid of Site B, because it's duplicate content. First I tried to remove the site in Webmaster Tools and block all content in the robots.txt for Site B. This removed all content from the search results, but the links from Site B to Site A stayed in place, and even increased (after 2 months). I also tried to change all the pages on Site B to 404 pages, but this did not work either. I then removed the blocks, cleaned up the robots.txt, and changed the server config on Site B so that everything redirects (301) to a landing page for Site B. But the links to Site A from Site B in Webmaster Tools are still on the increase. What do you think is the best way to delete a site from Google and to delete all the links it had to other sites, so that there is NO history of this site? It seems that when you block it with robots.txt, the links and juice do not disappear; only the "blocked by robots.txt" count in WMT increases. Any suggestions?
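For reference, the blanket robots.txt block described above amounts to this; as the asker observed, it stops crawling but does not erase links Google has already discovered:

```
# robots.txt on Site B: blocks crawling of every page.
# This hides the pages from search results, but links Google
# already knows about stay in the Webmaster Tools reports.
User-agent: *
Disallow: /
```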
Intermediate & Advanced SEO | JacoRoux
-
Google, Links and JavaScript
So today I was taking a look at the http://www.seomoz.org/top500 page and saw that the AddThis page is currently at position 19. I think the main reason for that is that their plugin creates, through JavaScript, linkbacks to the page where their share buttons reside. So any page with AddThis installed would easily have 4 or 5 linkbacks to their site, creating the huge amount of linkbacks they have. OK, that pretty much shows that Google doesn't care whether the link is created in the HTML (on the backend) or through JavaScript (frontend). But here's the catch. Say someone creates a free plugin for WordPress/Drupal or any other huge CMS platform out there with a feature that links back to the page of the creator of the plugin (that's pretty common, I know), but instead of inserting the link in the plugin's source code they put it somewhere else, which is then loaded with JavaScript code (exactly how AddThis works). This would allow the owner of the plugin to change the link shown at any time. The main reason for that would be, I don't know, a URL update for his blog or business or something. However, it could just as easily be used to link to whatever the hell the owner of the plugin wants. What are your thoughts on this? I think it could easily be classified as white hat or black hat depending on what the owners do. However, would Google see it the same way?
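The pattern being described, where a link's destination lives outside the plugin's shipped source so the owner can change it later, could be sketched like this (the function name and config shape are hypothetical, purely to illustrate the mechanism):

```javascript
// Hypothetical sketch of a JavaScript-injected linkback: the href and
// anchor text come from a config fetched from the plugin owner's server,
// so they can be changed at any time without updating the plugin itself.
function buildLinkback(doc, remoteConfig) {
  const a = doc.createElement("a");
  a.href = remoteConfig.href;        // remote-controlled destination
  a.textContent = remoteConfig.text; // remote-controlled anchor text
  return a;
}
```

Because the markup only exists after the script runs, nothing in the plugin's distributed code reveals where the link will point tomorrow, which is the abuse potential the question raises.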
Intermediate & Advanced SEO | bemcapaz
-
Getting a site to rank in both google.com and google.co.uk
I have a client who runs a yacht delivery company. He gets business from the US and the UK, but due to the nature of his business, he isn't really based anywhere except the middle of the ocean somewhere! His site is hosted in the US, and it's a .com. I haven't set any geographical targeting in Webmaster Tools either. We're starting to get some rankings in Google US, but very little in Google UK. It's a small site anyway, and he'd prefer not to have too much content on the site saying he's UK-based, as he's not really based anywhere. Any ideas on how best to approach this?
Intermediate & Advanced SEO | PerchDigital