Negative SEO attack working amazingly on Google.ca
-
We have a client www.atvandtrailersales.com who recently (March) fell out of the rankings. We checked their backlink file and found over 100 spam links pointing at their website with terms like "uggboots" and "headwear" etc. etc.
I submitted a disavow link file, as this was obviously an attack on the website.
Since the recent Panda update, the client has dropped back out of the rankings for a majority of keyword phrases. The disavow file that was submitted back in March contains 90% of the same links that are still spamming the website now.
I've sent a spam report to Google and nothing has happened. I could submit a new disavow link file, but I'm not sure if this is worth the time.
Thanks!
-
Thank you for that, I will have a look now. When we have looked at our links we have found a lot of people linking to our site, but trying to decide whether these are good links or bad links has been hard for us.
We did find a lot of directory sites and also game sites that were linking to us and we could not understand why they were doing this.
-
Zack,
They're actually linking to a dynamic inventory page which is updated daily with new products. Sometimes they link to a specific piece of inventory, which ultimately 404s when the product is sold and the link no longer exists.
Wish it was that easy, but the links keep coming
-
Hi Tim,
If you log into webmaster tools and look at Traffic > Links to your site you can see who's linking. Make sure you click on "More>>". Alternatively you can use Moz's OSE and run a backlink report on your site.
It was immediately clear that this was happening again as we had 26 links from videogamersoasis.com for a power sports website...
To submit a file:
Webmaster Tools > Traffic > Links to your site, click on "Download latest links" and then filter out any links you actually put there. Go to https://www.google.com/webmasters/tools/disavow-links-main?pli=1 and submit the file.
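For reference, the disavow file Google expects is just a plain-text (UTF-8) file with one entry per line: lines starting with # are comments, domain: entries disavow every link from a whole domain, and bare URLs disavow individual pages. A minimal sketch (the second URL is a made-up example, not from this case):

```text
# Spam links discovered March; site owners could not be reached
domain:videogamersoasis.com
http://example-spam-site.com/uggboots-page.html
```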
Only do this if you KNOW you've been attacked and you can't contact the website owners to take the links down.
-
I think the same is happening to our site. Can you let me know how you found the spammy links and how to submit the file? Also, have you tried filling in the contact form for Google? I know they are not great at communicating, but maybe it's worth a shot.
-
That sucks. If I were you I probably would not submit a new disavow link file, because the first one is probably still in their queue.
How many different URLs were the spammy links pointing to? If it's just a few, could you noindex the tainted pages and create new URLs?
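If you go that route, a page-level noindex is just a meta tag in the head of each tainted URL (an X-Robots-Tag HTTP header does the same thing for non-HTML resources):

```html
<!-- In the <head> of each page you want dropped from the index -->
<meta name="robots" content="noindex">
```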
Related Questions
-
What does Google's Spammy Structured Markup Penalty consist of?
Hey everybody,
I'm confused about the Spammy Structured Markup Penalty: "This site may not perform as well in Google results because it appears to be in violation of Google's Webmaster Guidelines." Does this mean the rich elements are simply removed from the snippets? Or will there be an actual drop in rankings? Can someone here tell from experience? Thanks for your help!
White Hat / Black Hat SEO | | klaver
-
How does google know if rich snippet reviews are fake?
According to https://developers.google.com/structured-data/rich-snippets/reviews, all someone has to do is add some HTML code and write the review. How does Google do any validation on whether these reviews are legitimate or not?
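For context, the review markup in question is purely declarative data the site author writes themselves, which is why validation of legitimacy is the open question. A hypothetical schema.org Review snippet (all values made up) looks roughly like:

```json
{
  "@context": "http://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Example Widget" },
  "author": { "@type": "Person", "name": "Jane Doe" },
  "reviewRating": { "@type": "Rating", "ratingValue": "5", "bestRating": "5" }
}
```

Nothing in the markup itself proves a real customer wrote the review; any checking Google does has to happen on their side.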
White Hat / Black Hat SEO | | wlingke
-
Alltop good for SEO?
Are there any negative effects on getting your blog posted on alltop? Good SEO value or not?
White Hat / Black Hat SEO | | DemiGR
-
Old SPAM tactic still works and gets TOP 3 in SERP?
Hi Mozers, Below you can see some examples of spam (hidden text and sneaky redirects) which have been in the SERP for our branded keywords during the last 3 months. Some of them occupy very high positions (top 3/top 5). https://www.google.com/search?num=100&newwindow=1&safe=off&biw=1883&bih=1028&q=%22your+mac+-%22%2B%22cleanmymac%22 I have sent spam reports and I'm going to continue doing so (~500 spam reports from my personal and work Google accounts). I have also contacted some of the hacked sites' webmasters directly and tried to help them fix the issue, but it takes a lot of my time. But 3 months!? Can you give me any advice on what to do next? Thank you!
White Hat / Black Hat SEO | | MacPaw
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".

Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, what would happen on the backend of our site is that we would detect the user-agent of all traffic and, once we found a search bot, serve up our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But is this technique cloaking?

From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
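A minimal sketch of the user-agent check being described, assuming a Node.js backend. The bot list is illustrative, not an official one, and `shouldRenderServerSide` is a hypothetical name for the decision point:

```javascript
// Illustrative user-agent patterns for common crawlers (not exhaustive).
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function shouldRenderServerSide(userAgent) {
  // Treat a missing user-agent like a client that can't run JavaScript:
  // safer to render the page server-side.
  if (!userAgent) return true;
  return BOT_PATTERN.test(userAgent);
}

// Crawler gets the server-rendered page; a normal browser gets the
// client-side Dust.js template. The content itself is identical.
console.log(shouldRenderServerSide(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)')); // true
console.log(shouldRenderServerSide(
  'Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/90.0')); // false
```

The key point in the question is that both branches render the *same* template with the same data; only the rendering location differs.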
White Hat / Black Hat SEO | | Bodybuilding.com
-
SEO for location outside major city
Hello, I'm hoping to get some opinions on optimising a site for a client based 30 minutes outside of Dublin. Obviously there is higher search volume for "x in Dublin" than "x in small town". What do you think the best strategies are for incorporating "Dublin" into keywords? For example, is it OK to use phrases like "x near Dublin" or "x in Greater Dublin", or do you think this is a bit misleading? The client in question sells goods online, so the customer wouldn't physically have to visit the store. Thanks!
White Hat / Black Hat SEO | | gcdtechnologies
-
What Google considers to be a branded keyword?
We can set our own keywords as branded in a SeoMoz campaign, but Google would not necessarily see them as branded. After reading the blog post at http://www.seomoz.org/blog/how-wpmuorg-recovered-from-the-penguin-update I had a question: are there known rules (or at least guesses) about what Google considers a branded keyword/anchor text? I guess the first one would be your website domain, so bluewidget.com would be a branded keyword for the bluewidget.com website. How about Blue Widget or Blue Widget Company?
White Hat / Black Hat SEO | | SirMax