Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How to check if a site is doing blackhat SEO?
-
Thanks in advance!
-
It really depends on what you define as blackhat. On-page trickery (cloaking, redirects for search engine bots, etc.) can be discovered by browsing as a search bot, digging into code, viewing caches, etc. Danny Sullivan and Rand uncovered a large amount of cloaked (and stolen) content on stage at SMX Sydney a few years ago. It was quite entertaining at the time.
Some people are basic enough to use tactics like hidden, white-on-white text, as Martijn says. I'm yet to see that tactic actually working post-2004, though.
If it's links they're using, the easiest way is to use a tool like Open Site Explorer, Ahrefs or similar to check the links out. Sneaky people can block the OSE / Ahrefs / MajesticSEO bots from crawling the sources of their blackhat links if they have access to the linking sites. You can block the bots either in robots.txt or by rejecting their visits, so the bots never note that the links exist. That way, the backlink analysis tools will never see that blackhatsite.com links to rankingsite.com, and so forth. It takes a big network that the spammer controls to block the link research tools' bots from every link you build, however, so this isn't too common.
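One way to spot that kind of bot-blocking from the outside is to read the suspect site's robots.txt and look for rules aimed at the link research crawlers. Below is a minimal sketch in TypeScript (Node 18+ for the built-in fetch). The user-agent names rogerbot (Moz/OSE), AhrefsBot and MJ12bot (Majestic) are the crawlers' documented names; the parser is deliberately simplified and only flags blanket "Disallow: /" rules, so treat it as a quick first check rather than a full robots.txt implementation - and remember that rejecting visits server-side leaves no trace in robots.txt at all.

```typescript
// Quick check: does a site's robots.txt block the major backlink-research crawlers?
// Simplified on purpose: only blanket "Disallow: /" rules per user-agent group are flagged.

const LINK_RESEARCH_BOTS = ["rogerbot", "ahrefsbot", "mj12bot"]; // Moz, Ahrefs, Majestic

async function blockedLinkBots(domain: string): Promise<string[]> {
  const res = await fetch(`https://${domain}/robots.txt`);
  if (!res.ok) return []; // no robots.txt, so nothing is blocked this way

  const lines = (await res.text()).split("\n").map((l) => l.trim().toLowerCase());
  const blocked = new Set<string>();
  let groupAgents: string[] = [];
  let groupHasRules = false;

  for (const line of lines) {
    if (line.startsWith("user-agent:")) {
      // A user-agent line that follows rules starts a new group.
      if (groupHasRules) {
        groupAgents = [];
        groupHasRules = false;
      }
      groupAgents.push(line.slice("user-agent:".length).trim());
    } else if (line.startsWith("disallow:") || line.startsWith("allow:")) {
      groupHasRules = true;
      const path = line.split(":")[1]?.trim();
      if (line.startsWith("disallow:") && path === "/") {
        for (const bot of LINK_RESEARCH_BOTS) {
          if (groupAgents.includes(bot)) blocked.add(bot);
        }
      }
    }
  }
  return Array.from(blocked);
}

blockedLinkBots("example.com").then((bots) =>
  console.log(bots.length ? `Blocks link bots: ${bots.join(", ")}` : "No link bots blocked in robots.txt")
);
```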
Whether all big brands / well-ranked sites are using blackhat tactics pretty much depends on your definition of blackhat, but it's certainly true that it is very hard, if not impossible, to rank top 3 for competitive terms (car insurance, poker, credit cards) without parting with money that results in links being built. This doesn't mean that they're all buying links, but they're definitely investing in marketing that results in links, and the whitest of the whitehats will say that this is technically not organic, natural link development. It is, however, what we do - marketing.
-
Why does it matter?
-
An even easier way is to check their rankings - if they're top 3 for big money terms in their niche, they're probably using some blackhat tactics. Even the whitest of whitehats are still using some blackhat tactics in the background, despite people not wanting to admit it.
-
I can't agree more with Gary; we probably need some more information to know what kind of black hat you're possibly dealing with. The first things I tend to look at when trying to work out whether a site is using black hat tactics are:
- Backlink profile: if the quality of the links is low, or the split between follow and nofollow links looks unusual, it could be a sign.
- Look at the site with Googlebot as the user agent and see if it shows different information than it shows to a real user (see the sketch after this list).
- Just do a select all on the site to see if they hide any content (yup, still happens).
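For the user-agent check above, a rough way to automate the first pass is to request the same URL once with a Googlebot user-agent string and once with a normal browser string, then compare what comes back. A minimal sketch in TypeScript (Node 18+ fetch) follows; the 80% similarity threshold is an arbitrary placeholder, and a size difference is only a hint - sites can cloak by IP instead, personalise content legitimately, or render via JavaScript - so anything flagged still needs a manual look.

```typescript
// Rough cloaking check: fetch a page as "Googlebot" and as a normal browser,
// then compare the visible text the server returns in each case.

const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
const BROWSER_UA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36";

async function fetchAs(url: string, userAgent: string): Promise<string> {
  const res = await fetch(url, { headers: { "User-Agent": userAgent } });
  return res.text();
}

// Strip scripts, styles and tags so we compare visible-ish text, not markup noise.
function visibleText(html: string): string {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ")
    .replace(/<style[\s\S]*?<\/style>/gi, " ")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

async function compareCloaking(url: string): Promise<void> {
  const [asBot, asUser] = await Promise.all([
    fetchAs(url, GOOGLEBOT_UA),
    fetchAs(url, BROWSER_UA),
  ]);
  const botLen = visibleText(asBot).length;
  const userLen = visibleText(asUser).length;
  const similarity = Math.min(botLen, userLen) / Math.max(botLen, userLen, 1);

  console.log(`Bot text: ${botLen} chars, user text: ${userLen} chars`);
  console.log(
    similarity < 0.8
      ? "Responses differ a lot - possible user-agent cloaking, check manually."
      : "Responses are similar in size - no obvious UA-based cloaking."
  );
}

compareCloaking("https://example.com/").catch(console.error);
```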
-
Your question is a bit too open-ended - what do you want to achieve by knowing this information?
Does a site rank better than you?
Are they doing negative SEO against other people?
Do they steal content from people?
Are they building dofollow links from places they shouldn't?
Too many questions to ask before answering such a vague question.
Related Questions
-
Inbound links to internal search with pharma spam anchor text. Negative SEO attack
Suddenly in October I had a spike in inbound links from forums and spam sites. Each one had set up hundreds of links. The links go to the WordPress internal search. Example: mysite.com/es/?s=⚄
-
What is the proper URL length for SEO?
I learned that having 50 to 60 characters in a URL is OK and that having fewer is preferred by Google, but I'm going to include keywords in the URLs and I'm afraid it will increase the length. Is that going to hurt me slightly? My competitors have an 8-character domain URL and a keyword length of 13, and my site has a 15-character domain URL and a keyword length of 13. Which one will be preferred by Google?
-
Are IDs in URLs good for SEO? Will SEO submission sites allow such URLs?
Example URL: http://public.beta.travelyaari.com/vrl-travels-13555-online It's our site's beta URL, and we are going to implement it for our site. After implementation, it will be live on travelyaari.com like this: "https://www.travelyaari.com/vrl-travels-13555-online". We have added the keywords etc. in the URL ("VRL Travels"). But the problem is, there are multiple VRL Travels available, so we made it unique with a unique ID in the URL: "13555". That way we know exactly which VRL Travels it is, and it also solves URL duplication. Also, from a user/SEO point of view, the URL has readable text/keywords: "vrl travels online". Can some Moz experts suggest whether it will affect SEO performance in any manner? Will SEO submission sites accept this URL? Meanwhile, I have tried submitting this URL to Reddit etc. and it got accepted.
-
Preventing CNAME Site Duplications
Hello fellow mozzers! Let me see if I can explain this properly. First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive the ignorance of terms). We have a client that needs a CNAME record set up, as they need sales.DOMAIN.com to go to a different provider of data. They have a "store" platform that is hosted elsewhere, and it requires a CNAME pointed at a custom subdomain they set up on their end. My question is: how do we prevent the CNAME subdomain from being indexed along with the main domain? If we process a redirect for the subdomain, then the site will not be able to go out and grab the other provider's info and display it. Currently, if you type in sales.DOMAIN.com, it shows the main site's homepage. That cannot be allowed to take place, as we all know that having more than one domain with the exact same content is very bad for SEO. I'd rather not rely on Google to figure it out. Should we just have the CNAME host (where it's pointing) add a robots rule and have it set to not index the CNAME subdomain? The store does not need to be indexed, as the items change almost daily. Lastly, is an A record required for this type of situation in any way? Forgive my ignorance of subdomains, CNAME records and related terms. Our server admin being unavailable is not helping this project move along. Any advice on the best way to handle this would be very helpful!
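One way to express "do not index" for everything served on the subdomain, without redirecting it, is to have whatever answers for sales.DOMAIN.com send an X-Robots-Tag: noindex response header (or an equivalent robots meta tag in the pages). Note that this only works if the subdomain is not also blocked in robots.txt, since crawlers have to fetch a response to see the header. Below is a minimal sketch of the idea in TypeScript/Node; the hostname check and port are placeholders, and in practice the header would be set by the store platform or its front-end proxy rather than a hand-rolled server.

```typescript
// Sketch: any response served on the sales.* subdomain carries an
// X-Robots-Tag: noindex header, so it can be crawled but not indexed.
import * as http from "http";

const server = http.createServer((req, res) => {
  const host = req.headers.host ?? "";

  if (host.startsWith("sales.")) {
    // Applies to every response on the subdomain, including non-HTML assets.
    res.setHeader("X-Robots-Tag", "noindex");
  }

  // Placeholder body: in reality the store platform or a reverse proxy
  // would render/forward the real store content here.
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end("<html><body>Store content goes here</body></html>");
});

server.listen(8080, () => {
  console.log("Listening on :8080 (stand-in for the subdomain host)");
});
```
-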
Why do expired domains still work for SEO?
Hi everyone. I've been running an experiment for more than a year to see whether buying expired domains works. I know it's considered black hat, but like I said, I wanted to experiment - that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links to other relevant sites from these domains. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that the domains become worthless after they expire. But after trying it out for more than a year and with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains, or the sites I link to, have been punished by Google. The sites I'm linking to rank great with ONLY those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
-
Asynchronous loading of product prices bad for SEO?
We are currently looking into improving our TTFB on our ecommerce site. A huge improvement would be to asynchronously load the product prices on the product list pages. The product detail page - on which the product is ordered - will be left untouched. The idea is that all content like product data, images and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an AJAX call to reduce the TTFB. My question is whether Google considers this black hat SEO or not.
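A minimal sketch of the kind of client-side price fill-in described above, in TypeScript for the browser; the /api/prices endpoint, the data-product-id attributes and the response shape are made-up placeholders, not a reference to any real API. The relevant design point is that bots and users receive exactly the same HTML and the same script - only the values filled in afterwards differ per visitor.

```typescript
// After the static product-list HTML has been delivered, fill in the prices
// with a single AJAX call. Product names, images, etc. are already in the HTML;
// only the user-specific prices (delivery location, VAT setting, ...) arrive later.

interface PriceResponse {
  [productId: string]: string; // e.g. { "123": "19.95 EUR" }
}

async function loadPrices(): Promise<void> {
  // Each price slot is a placeholder element like:
  // <span class="price" data-product-id="123"></span>
  const slots = Array.from(
    document.querySelectorAll<HTMLElement>(".price[data-product-id]")
  );
  const ids = slots.map((el) => el.dataset.productId!);
  if (ids.length === 0) return;

  // Hypothetical endpoint that knows this visitor's location / VAT preference.
  const res = await fetch(`/api/prices?ids=${ids.join(",")}`, {
    credentials: "same-origin",
  });
  const prices: PriceResponse = await res.json();

  for (const el of slots) {
    const price = prices[el.dataset.productId!];
    if (price) el.textContent = price;
  }
}

document.addEventListener("DOMContentLoaded", () => {
  loadPrices().catch(console.error);
});
```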
-
Does an IP blacklist cause SEO issues?
Hi, our IP was recently blacklisted - we had a malicious script sending out bulk mail in a Joomla installation. Does it hurt our SEO if we have a domain hosted on that IP? Any solid evidence? Thanks.
-
Do pingbacks in WordPress help or harm SEO? Or neither?
Hey everyone, just wondering: do pingbacks in WordPress help or harm SEO? Or neither?