750,000 pv/month due to webspam. What to do?
-
Let's say your user-generated content strategy is wildly successful, in a slightly twisted sense: webspammers fill it with online streaming sports teasers and the promise of "Weeds season 7 episode 11." Thanks to the hard SEO work done to build the domain's profile, these webspam pages rank well in Google and deliver nearly 750k pageviews, and many, many unique visitors, to the site every month.
The ad-sales team loves the traffic boost. Overall traffic, uniques, and search numbers look rosy.
What do you do?
a) let it ride
b) throw away roughly half your search traffic overnight by deleting all the spam and tightening the controls to prevent spammers from continuing to abuse the site
There are middle-ground solutions, like using NOINDEX more liberally on UGC pages, but the end result is the same as option (b) even if it takes longer to get there.
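(For reference, that middle-ground route is just a robots meta tag on each UGC page, which keeps the page out of Google's index while still serving it to visitors:)

```html
<!-- In the <head> of each user-generated page you don't want indexed -->
<meta name="robots" content="noindex">
```

If the UGC all lives under one path, the same directive can also be sent site-wide as an `X-Robots-Tag` HTTP header instead of editing every template.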
-
You seem to have a clear understanding of the situation. You are making the conscious choice to continue with your current business practices. It makes sense.
You have a monetary incentive to capture as much traffic as possible due to advertising revenue. As EGOL suggested, I believe the best-paying advertisers will recognize your traffic as low quality and either choose not to advertise on your site or pay substantially less than they would for a similar ad on a better site.
You also run the risk of losing many users. Humans don't like spam sites and will leave them for better ones. Additionally, Panda updates will surely make it harder for your site to rank for its legitimate content.
Feel free to disregard this advice. I predict at some point in the not-too-distant future you will lose your advertisers or your traffic. The amount of effort you spend trying to get either back will ensure you never travel down this path again.
-
Ryan - not half the site's traffic, but half the site's search traffic. And even that is an exaggeration. Webspam search traffic accounts for 28% of overall search traffic.
EGOL - I would say no to the question of robot visitors, because on the instances we checked -- in which spammers used a bit.ly URL for their outbound link -- we were able to measure an astounding 47% clickthrough rate from our site to the spam destination. I would not expect bots to click through.
Also, we use nofollow on all outbound links in user-generated content. I guess that is not a guarantee that we would not be penalized for hosting a linkfarm, but shouldn't it be?
If it were up to me, I'd wipe out the webspam entirely. But it's not an easy sell. This content delivers ~750,000 pageviews, ~150k ad views, and probably 100k unique visitors per month, plus the small risk that one day G might penalize us for it. It's not pills, porn, gambling, mortgages, and all the links are nofollowed. The people making this decision don't see a smoking gun.
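(For anyone unfamiliar with the nofollow point above: every user-submitted outbound link is rendered along these lines, which tells search engines not to pass link equity to the destination. The URL here is a placeholder, not one of the actual spam targets:)

```html
<!-- Outbound UGC links carry rel="nofollow" so they pass no PageRank -->
<a href="http://spam-destination.example.com/" rel="nofollow">anchor text</a>
```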
-
I have two concerns...
Are you getting a lot of robot visitors instead of human visitors? If you are getting lots of robots then those visits will not be valuable to your advertisers and they will eventually stop paying to appear on your site. The best advertisers are really smart about this.
Are these sports teaser posts accompanied by links to other websites? If so, I would cut them off right away, because they are probably making you a linkfarm for spammy websites.
-
The problem you face is by allowing spam, your real users will be unhappy. Your main site visitors may leave your site for another, spam-free site. It is likely you have already permanently lost some traffic due to the spam.
At present, you describe your site as 50% spam traffic, 50% real traffic. Two things will likely happen over time: Google will recognize your site is spammy and penalize it in some form, and your users will grow unhappy, shifting your visitor mix even further toward spam traffic. Once that happens, I anticipate a fast decline.
I suggest option (b) as being in your site's best long-term interest.