750,000 pv/month due to webspam. What to do?
-
Let's say your user-generated content strategy is wildly successful, in a slightly twisted sense: webspammers fill it with online streaming sports teasers and the promise of "Weeds season 7 episode 11." Thanks to the hard SEO work done to build the domain's profile, these webspam pages rank well in Google and deliver nearly 750k pageviews, plus a great many unique visitors, to the site every month.
The ad-sales team loves the traffic boost. Overall traffic, uniques, and search numbers look rosy.
What do you do?
a) let it ride
b) throw away roughly half your search traffic overnight by deleting all the spam and tightening the controls to prevent spammers from continuing to abuse the site
There are middle-ground solutions, like using NOINDEX more liberally on UGC pages, but the end result is the same as option (b) even if it takes longer to get there.
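For concreteness, here is a minimal sketch of what "using NOINDEX more liberally on UGC pages" could look like. The framework (Flask) and the /ugc/ URL prefix are assumptions for illustration only, not a description of the actual site:

```python
# Hypothetical sketch: keep UGC pages crawlable but out of the index.
# Assumes a Flask app and that all user-generated pages live under /ugc/.
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def noindex_ugc(response):
    # X-Robots-Tag is honored by the major crawlers like a meta robots tag,
    # so one hook covers every UGC template without editing each one.
    if request.path.startswith("/ugc/"):
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response
```

Served this way, the pages keep whatever direct and internal-navigation traffic they have, but they gradually drop out of the index, which is why the end result converges on option (b).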
-
You seem to have a clear understanding of the situation. You are making the conscious choice to continue with your current business practices. It makes sense.
You have a monetary incentive to capture as much traffic as possible due to advertising revenue. As EGOL suggested, I believe the best-paying advertisers will recognize your traffic as low quality and either choose not to advertise on your site or pay substantially less than they would for a similar ad on a better site.
You also run the risk of losing many users. Humans don't like spam sites and will leave them for better ones. Additionally, Panda changes will surely make it harder for your site to rank on its legitimate content.
Feel free to disregard this advice. I predict that at some point in the not-too-distant future you will lose your advertisers or your traffic. The amount of effort you spend trying to get either back will ensure you never travel down this path again.
-
Ryan - not half the site's traffic, but half the site's search traffic. And even that is an exaggeration. Webspam search traffic accounts for 28% of overall search traffic.
EGOL - I would say no to the question of robot visitors, because on the instances we checked -- in which spammers used a bit.ly URL for their outbound link -- we were able to measure an astounding 47% clickthrough rate from our site to the spam destination. I would not expect bots to click through.
Also, we use nofollow on all outbound links in user-generated content. I guess that is not a guarantee that we would not be penalized for hosting a linkfarm, but shouldn't it be?
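For reference, a rough sketch of how that kind of blanket nofollow could be enforced when UGC is saved; the storage format (HTML fragments), the BeautifulSoup dependency, and the example.com domain are assumptions, not necessarily how our system actually works:

```python
# Hypothetical sketch: force rel="nofollow" on every outbound link in a
# user-submitted HTML fragment before it is stored or rendered.
from bs4 import BeautifulSoup

def nofollow_outbound_links(html_fragment, our_domain="example.com"):
    soup = BeautifulSoup(html_fragment, "html.parser")
    for link in soup.find_all("a", href=True):
        href = link["href"]
        # Relative URLs and links to our own domain stay followed; everything
        # pointing off-site gets nofollowed. (Naive check, for illustration.)
        if href.startswith("http") and our_domain not in href:
            link["rel"] = "nofollow"
    return str(soup)
```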
If it were up to me, I'd wipe out the webspam entirely. But it's not an easy sell. This content delivers ~750,000 pageviews, ~150k ad views, and probably 100k unique visitors per month; against that sits only the small risk that one day Google might penalize us for it. It's not pills, porn, gambling, or mortgages, and all the links are nofollowed. The people making this decision don't see a smoking gun.
-
I have two concerns....
Are you getting a lot of robot visitors instead of human visitors? If you are getting lots of robots then those visits will not be valuable to your advertisers and they will eventually stop paying to appear on your site. The best advertisers are really smart about this.
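One rough way to check, if you have raw access logs; the log format (Apache/Nginx "combined", user-agent in the last quoted field) and the bot keyword list are assumptions, so treat this as a starting point rather than a definitive bot filter:

```python
# Rough sketch: estimate what share of requests come from self-identified bots.
import re
from collections import Counter

BOT_HINTS = ("bot", "crawler", "spider", "slurp")

def bot_share(log_path):
    counts = Counter()
    ua_at_end = re.compile(r'"([^"]*)"\s*$')  # last quoted field = user-agent
    with open(log_path) as log:
        for line in log:
            match = ua_at_end.search(line)
            ua = match.group(1).lower() if match else ""
            counts["bot" if any(hint in ua for hint in BOT_HINTS) else "human"] += 1
    total = sum(counts.values()) or 1
    return counts["bot"] / total
```

This only catches bots that identify themselves; scrapers spoofing browser user-agents still register as "human," which is why a clickthrough measurement like the bit.ly check mentioned earlier in the thread is a useful cross-check.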
Are these sports teaser posts accompanied by links to other websites? If that is happening, I would cut them off right away, because they are probably turning your site into a linkfarm for spammy websites.
-
The problem you face is that by allowing spam, your real users will become unhappy. Your main site visitors may leave for another, spam-free site. It is likely you have already permanently lost some traffic to the spam.
At present you describe your site as 50% spam traffic, 50% real traffic. Two things will likely happen over time: Google will recognize your site as spammy and penalize it in some form, and your users will grow unhappy with the site, shifting the visitor mix even further toward spam. Once that happens, I anticipate a fast decline.
I suggest option (b) as being in your best interest for the long-term health of your site.