My competitors are using black hat. What should I do?
-
My competitors are using on-page black hat methods, like keyword stuffing. What should I do?
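If you want to gauge how aggressively a competitor is repeating a phrase, a rough keyword-density check is easy to script. This is a minimal sketch (the function name, the sample text, and the idea that any particular density counts as "stuffing" are all assumptions, not an official test Google uses):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the share of words in `text` taken up by occurrences
    of `phrase`. A crude stuffing signal; thresholds are a judgment call."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    target = phrase.lower().split()
    n = len(target)
    # Count non-overlapping-style window matches of the phrase.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == target
    )
    return hits * n / len(words)

sample = ("mp3 downloads free mp3 downloads best mp3 downloads "
          "get your mp3 downloads here")
print(round(keyword_density(sample, "mp3 downloads"), 2))
```

Run it against the visible text of a competitor's page; a phrase eating well over half the words, as in the sample, is the kind of repetition people usually mean by stuffing.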
-
-
MP3 downloads aren't competitive? I would have thought they were quite competitive, with fairly cutthroat competition.
-
Concentrate on your on-page SEO and try to get UGC on your website, since it should match well with the industry you are in, and in a few months you will see a difference!
-
In your opinion, what should I do?
-
When you are in such a low-volume or non-competitive industry, it is important to know that Google may allow someone with low-quality pages, even someone using grey hat SEO, to rank, just because that someone may be better than the rest.
-
Thanks for your reply. How can I contact you, as I can't post URLs here? (New to Moz.)
-
Yes, I am in a non-competitive industry (MP3 downloads).
-
I reported them a week ago, but nothing has happened.
-
Having gone down that road myself: never mess with black hat. Keep it nice and clean, work on your website, improve your on-page optimization, report them to be on the safe side, and wait...
The worst thing someone could do is fight fire with fire. Let them burn and stay clear. The only issue is if you are in a non-competitive industry. Is that the case?
-
My guess is there are probably other factors causing them to rank over you. Keyword stuffing in divs is so 2001...
You can report them as RangeMaketing suggested, sure. I would review other on-page areas to see if there are ways you can improve on the basics. I would also see if there are any links passing link equity that you can acquire to match their profile. If your page is better across the board for SEO than theirs and you have more PA/DA, etc., you may want to consider getting some co-citations so you appear as an equal to them, on top of your better on-page metrics.
Hope this helps.
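For "improving on the basics," it can help to pull the same on-page signals from your page and theirs and compare side by side. A minimal sketch using only the standard library (the class name and the specific signals collected are my own choices, not a Moz tool or an official checklist):

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collect basic on-page signals (title text, meta description,
    h1 count) from raw HTML. A starting point, not a full audit."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        # Accumulate text only while inside <title>.
        if self._in_title:
            self.title += data

page = """<html><head><title>MP3 Downloads | Example</title>
<meta name="description" content="Download MP3s."></head>
<body><h1>MP3 Downloads</h1></body></html>"""

audit = OnPageAudit()
audit.feed(page)
print(audit.title, audit.h1_count)
```

Fetch both pages' HTML however you like, feed each through the parser, and compare the results: a missing description or multiple h1s on your page is a basic you can fix before worrying about what the competitor is doing.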
-
The best you can do is report the website for 'cloaking' via Webmaster Tools:
-
Yes, it is working for them, but I don't want Google to penalize my site down the line.
-
Is it working for them? If so, see if it works for you.
There is no white hat/black hat. There are only differing levels of risk tolerance. And risky SEO does pay off when done correctly.