How do I find out if a competitor is using black hat methods and what can I do about it?
-
A competitor of mine has appeared out of nowhere with several different websites, each targeting slightly different keywords but all in the same industry.
They don't have as many links as I do, and their site structure and code are truly awful (multiple H1s on the same page, tables for non-tabular data, etc.), yet they outperform my site and many of my other competitors'.
It's a long story, but I know someone who knows the people who run these sites, and from what I can gather they are using black hat techniques. That is all I know, though, and I would like to find out more so I can report them.
-
Kevin, can you give us any type of update as to how this turned out? Did the rankings drop back after a bit, or are they still there? Did you figure out if they were using any shady tactics?
-
There is a lot of analysis you need to do to find out whether someone is using black hat techniques on their site (much like the NY Times hiring an SEO expert to investigate JC Penney), because there are a lot of tricks people use to try to fool Google.
Just some examples:
Onsite
- Are they stuffing keywords?
- Are they using doorway pages?
Offpage
- Are they buying links? (This is the most common.)
- Are they spamming blog comments?
At the end of the day, if you find that they are actually using black hat methods, you can report them through Google Webmaster Tools. Make sure you include all your evidence, as Google will not spend time analyzing vague requests.
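On the keyword-stuffing check above: a minimal, hypothetical sketch of how you might measure it yourself once you have a page's visible text (the function name, threshold, and sample text are illustrative, not any official tool):

```python
from collections import Counter
import re

def keyword_density(text, top_n=5):
    """Return the top_n most frequent words and their share of all words."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return [(word, count / total) for word, count in counts.most_common(top_n)]

# Invented sample text standing in for a scraped competitor page.
page_text = (
    "cheap widgets cheap widgets buy cheap widgets today "
    "the best cheap widgets on the web"
)
for word, share in keyword_density(page_text):
    # A single non-stopword taking up a very large share of the page
    # is a classic sign of keyword stuffing.
    print(f"{word}: {share:.0%}")
```

There is no hard cutoff, but a commercial keyword making up a double-digit percentage of a page's words is worth a closer look.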
-
Hi Kevin,
It is really hard to conclude that a website is using black hat techniques unless I can check it myself, but for starters (which I believe you already did) you can always check the following: PageRank and inbound and outbound links (use Yahoo for this), plus the tools in SEOmoz to judge the quality of those links.
It could also be that those websites sit on different IPs and each site links out to authority domains. Needless to say, you should also check whether your own website is properly optimized and how recently it has been cached.
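On the different-IPs point: a hedged sketch of how you might group a set of suspect domains by the server they resolve to. The domains below are invented, and the resolver is injectable so the example runs offline; for a real check you would pass `socket.gethostbyname`.

```python
import socket
from collections import defaultdict

def group_by_ip(domains, resolve=socket.gethostbyname):
    """Group domain names by the IP address they resolve to.

    Several sites sharing one IP are more likely to be run by the same
    person, which is one hint (not proof) of a private link network.
    """
    groups = defaultdict(list)
    for domain in domains:
        try:
            groups[resolve(domain)].append(domain)
        except OSError:
            groups["unresolved"].append(domain)
    return dict(groups)

# Fake DNS table standing in for real lookups (all names are made up).
fake_dns = {
    "widgets-a.example": "203.0.113.7",
    "widgets-b.example": "203.0.113.7",
    "widgets-c.example": "198.51.100.2",
}
print(group_by_ip(fake_dns, resolve=fake_dns.__getitem__))
```

Shared hosting puts unrelated sites on one IP all the time, so treat a match as a lead to investigate, not evidence on its own.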
This is a handful, but I am excited to see the outcome.
Cheers!