Should I Use the Disavow Tool for a Spammy Site/Landing Page?
-
Here's the situation...
There's a site linking to about 6 of my articles from about 225 of its pages (according to info in GWT). These pages are sales landing pages trying to sell their product, pretty much identical to one another but with different URLs. (I actually have a few sites doing this to me.)
Here's where I think it's really bad: the links to me aren't visible on the page. You have to view the page source and search for my site's URL to find them. I'm thinking that a hidden URL, and it being my article that's hidden, has got to be bad, on top of it being a sales page for a product.
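For what it's worth, a quick way to confirm a link really is buried in the source even though it never renders is to scan the raw HTML for your domain. A minimal sketch in Python (`example.com` stands in for your site, and the regex is a rough heuristic, not a full HTML parser):

```python
import re

def hidden_links_to(html, my_domain):
    """Return all href values in the raw page source that point at my_domain."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, re.IGNORECASE)
    return [h for h in hrefs if my_domain in h]

# A link inside a hidden container: invisible on the rendered page,
# but present in the source, just like the situation described above.
page = '<div style="display:none"><a href="http://example.com/my-article">x</a></div>'
print(hidden_links_to(page, "example.com"))
# → ['http://example.com/my-article']
```

Running this over each of the 225 landing-page URLs would give you a concrete list of which pages carry the hidden links, which is also useful evidence if you file a spam report.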
I've seen traffic to my site dropping but I don't have a warning in GWT.
These aren't links that I've placed or asked for in any way. I don't see how they could be good for me, and I've already emailed the site asking them to remove the links (I didn't think it would work, but thought I'd at least try).
I totally understand that the site linking to me may not have any effect on my current traffic.
So should I use the Disavow tool to make sure this site isn't counting against me?
-
Thanks. It seems there are so many opinions on disavow that it's hard to know what's right. A lot of people say to only use it when you get a GWT warning, but others say it's OK as a preventative measure.
I think I'm going to put together a list of the garbage sites that I know are pointing to me and disavow them.
-
As Moosa said, try to get them to take the links down first. That outcome is better, and Google says you should attempt to have links removed BEFORE using the disavow tool. Since you have already tried, go ahead and use it. You can disavow links from an entire domain, so you can just do that.
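For reference, the file you upload to the disavow tool is plain UTF-8 text, one entry per line: bare URLs to disavow individual pages, a `domain:` prefix to disavow a whole domain, and `#` lines for comments. Something along these lines (the domains here are placeholders, not real sites):

```text
# Sites hiding links to my articles in their page source
# (emailed the owners asking for removal, no response)
domain:spammy-landing-pages.example
domain:another-spam-site.example

# A single bad page, rather than the whole domain
http://www.example.org/bad-page.html
```

Since these landing pages all sit on a handful of domains, the `domain:` form covers all 225 pages in one line per site.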
-
The links were not built by you, you are sure the links pointing back from those URLs are bad links, and you have tried everything to remove them but failed... now the only option you have left is the disavow tool, so go for it!
It is important not to use the disavow tool if you haven't tried removing the bad links manually first, but since you did attempt that and failed, you should go with the option left to you, which is the disavow tool!
-
Use it.