Should I use the Disavow Tool at this point?
-
After Penguin, our site (www.stadriemblems.com) jumped up to #1 for the keyword "embroidered patches." Now, months later, it's at the top of page two. I'm pretty sure this is because we do have a few shady links (I didn't do it!) that perhaps Penguin didn't catch the first time around, but now Google is either discounting them or counting them against us.
My question is, since I'm pretty sure those links are the reason we are gradually declining, should I submit them to Google as disavowed, even though technically, we're not penalized . . . yet?
I have done everything possible to get them removed, and it's not happening.
-
No, I've not received a notification. Thanks, Sean.
-
Totally feel your pain, Marisa. It's frustrating to see the decline because of someone else's shoddy work. However, it's probably a better fate than a full-blown Penguin penalty, plus it shows that your other links must be great if the page has only dropped to #11. What a great feeling it will be, too, when you get back to #1 - knowing that everything you've done, Google loves.
Plus, I just checked your rank again and you've bopped back up to #10 on my SERP, so the comeback is underway!
-
I would not disavow your links at this stage. You don't say whether or not you received a notification from Google in your WMT.
If you did, then take action to remove the links, and disavow any that cannot be removed; but if you have not received a notification / warning, then leave the links in place.
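And just so you know what's involved if it ever does come to that: the disavow tool expects a plain UTF-8 text file, one entry per line, uploaded via Webmaster Tools. Lines starting with # are comments (handy for recording your removal attempts for a reconsideration request), a domain: prefix disavows every link from that domain, and a bare URL disavows just that one page. A quick sketch - the domains below are made-up placeholders, not anyone's actual links:

```text
# Disavow file sketch - all domains below are placeholder examples
# Contacted site owner twice (no response), requesting link removal
domain:spammy-directory.example.com

# Single page we couldn't get taken down
http://link-farm.example.net/widgets/embroidered-patches.html
```

One file per site, and re-uploading replaces the previous version, so keep a master copy locally.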
Build some good quality links and add new content.
I hope this helps
Sean
-
Tom,
Thanks for your advice. It seems logical, and now I think I remember reading that somewhere. It kinda sucks, though, because now all I can do is watch my rankings steadily decline and feel powerless to stop it. I guess I need to hope I earn enough good links to drown the rest of those out.
-
Hi Marisa
Unless you've received a notification in your WMT saying you've received a penalty, I wouldn't use the disavow tool. Google has been quite insistent that it should only be used as a last resort, after trying to remove links as part of a reconsideration request. I don't believe it should be used as a precautionary measure, nor would it have any effect unless your site is under a penalty.
I'd actually be quite optimistic about this. What I think we've seen over the last month is Google getting a lot better at discounting links as the algorithm updates. A post over on Inbound explains it quite well - it looks as though Google is aiming for continuous devaluation of links. I'm wondering whether this is the case for your site.
What could have happened is that the algorithm has looked at some of the links pointing towards your homepage (as that's the page ranking for that term), seen a few of them, and thought "nah, these are crap. Gonna remove their value". With them being homepage links, this might include a fair few of the older, spammier links the previous SEO put in place. It's consistent with what I've seen with a few sites in the UK - a quite sharp (but not huge) drop, all for keywords ranking via a particular page (usually the homepage).
Now, if this is the case, then I'd say it's a great leap forward by Google. Devaluing links on the fly could lead to less dramatic drops and clean-ups in the future. I'm fairly sure you wouldn't have had a penalty, so removing those few bad links would probably have been futile anyway - and especially so now, if Google has already devalued them.
All I'd recommend, Marisa, is continuing with the quality links you've been putting in place already. You may have lost the value of a certain few recently, but all in all it's a good thing, as they were probably a bit spammy anyway. I certainly wouldn't waste time disavowing if you haven't experienced a penalty.
Just my 2 copper, but I hope it helps!