Should I use the Disavow Tool at this point?
-
After Penguin, our site www.stadriemblems.com jumped up to #1 for the keyword "embroidered patches." Now, months later, it's at the top of page two. I'm pretty sure this is because we do have a few shady links (I didn't do it!) that perhaps Penguin didn't catch the first time around, but now Google is either discounting them or counting them against us.
My question is, since I'm pretty sure those links are the reason we are gradually declining, should I submit them to Google as disavowed, even though technically, we're not penalized . . . yet?
I have done everything possible to get them removed, and it's not happening.
-
No, I've not received a notification. Thanks, Sean.
-
Totally feel your pain, Marisa. It's frustrating to see the decline because of someone else's shoddy work. However, it's probably a better fate than a full-blown Penguin penalty, plus it shows that your other links must be great if the page has only dropped to #11. What a great feeling it will be when you climb back to #1, knowing that Google loves everything you've done.
Plus, I just checked your rank again and you've bopped back to #10 on my SERP, so the comeback is underway!
-
I would not disavow your links at this stage. You don't say whether or not you've received a notification from Google in your WMT.
If you did, then take action to remove the links, and disavow only those that cannot be removed. If you have not received a notification/warning, keep the links in place.
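(If it ever does come to disavowing, the file Google accepts is just a plain text list, one entry per line. Here's a rough sketch of assembling one — the domains, URLs, and filename are made-up examples, not anything pulled from Marisa's actual link profile:)

```python
# Minimal sketch of building a Google disavow file. Format rules:
#   - one entry per line
#   - "domain:example.com" disavows all links from that domain
#   - a bare URL disavows links from that single page only
#   - lines starting with "#" are treated as comments and ignored

spammy_domains = ["spam-directory.example", "link-farm.example"]
spammy_pages = ["http://old-blog.example/widget-links.html"]

lines = ["# Links we asked to have removed but couldn't"]
lines += [f"domain:{d}" for d in spammy_domains]  # whole-domain entries
lines += spammy_pages                             # single-page entries

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```

You'd then upload that file through the disavow tool in Webmaster Tools — but again, only after removal attempts have failed and you're dealing with an actual penalty.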
Build some good quality links and add new content.
I hope this helps.
Sean
-
Tom,
Thanks for your advice. It seems logical, and now I think I remember reading that somewhere. It kinda sucks, though, because now all I can do is watch my rankings steadily decline and be powerless to stop it. I guess I need to hope I earn enough good links to drown the rest of those out.
-
Hi Marisa
Unless you've received a notification in your WMT saying you've received a penalty, I wouldn't use the disavow tool. Google has been quite insistent that it should only be used as a last resort, after trying to remove links as part of a reconsideration request. I don't believe it should be used as a precautionary measure, nor would it have any effect unless your site is under a penalty.
I'd actually be quite optimistic about this. What I think we've seen over the last month is Google getting a lot better at discounting links as the algorithm updates. This post over on Inbound explains it quite well - it looks as though Google is aiming towards continuous devaluation of links. I'm wondering whether this is the case for your site.
What could have happened is that the algorithm has looked at some of the links pointing towards your homepage (as that's the page ranking for that term), seen a few of them and thought "nah, these are crap. Gonna remove their value". Since these are homepage links, that could cover a fair few of the older, spammier links the previous SEO put in place. This is consistent with what I've seen with a few sites in the UK - a quite sharp (but not huge) drop, all for keywords ranking for a particular page (usually the homepage).
Now, if this is the case, then I'd say it's a great leap forward by Google. Devaluing links on the fly could lead to less dramatic drops and clean-ups in the future. I'm fairly sure you haven't had a penalty, so removing those few bad links would probably have been futile anyway - but especially so now if Google has devalued them.
All I'd recommend, Marisa, is continuing with the quality links you've been putting in place already. You may have lost the value of a certain few recently, but all in all it's a good thing as they were probably a bit spammy anyway. I certainly wouldn't waste time disavowing if you haven't experienced a penalty.
Just my 2 copper, but I hope it helps!