My competitors are using black hat. What should I do?
-
My competitors are using on-page black hat methods, such as keyword stuffing. What should I do?
-
MP3 downloads aren't competitive? I would have thought that was quite a competitive, fairly cutthroat industry.
-
Concentrate on your on-page SEO and try to get UGC on your website, since that should match well with the industry you are in, and in a few months you will see a difference!
-
What do you think I should do?
-
When you're in such a low-volume or non-competitive industry, it's important to know that Google may allow someone with a low-quality site, even someone using grey hat SEO, to rank simply because they may still be better than the rest.
-
Thanks for your reply. How can I contact you, since I can't post URLs here? (New to Moz.)
-
Yes, I am in a non-competitive industry (MP3 downloads).
-
I reported them a week ago, but nothing has happened.
-
Having gone down that road myself: never mess with black hat. Keep it nice and clean, work on your website, improve your on-page optimization, report them to be on the safe side, and wait...
The worst thing you could do is fight fire with fire. Let them burn and steer clear. The only concern is if you are in a non-competitive industry. Is that the case?
-
My guess is there are probably other factors causing them to rank over you. Keyword stuffing in divs is so 2001...
You can report them, as RangeMarketing suggested, sure. But I would also review other on-page areas to see if there are basics you can improve, and check their link profile for any equity-passing links you could acquire as well. If your page is better across the board for SEO than theirs, and you have more PA/DA, etc., you may want to consider getting some co-citations so you appear as their equal, on top of your better on-page metrics.
Hope this helps.
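As a side note, if you want to sanity-check whether a competitor's page really is stuffing a term, a rough keyword-density calculation is a quick first pass. This is a hypothetical sketch, not anything from this thread: the helper name is made up, the tag stripping is deliberately crude (a real audit would use an HTML parser), and the idea that a density far above roughly 2-3% signals stuffing is a common rule of thumb, not an official Google threshold.

```python
import re

def keyword_density(html_text: str, keyword: str) -> float:
    """Return the keyword phrase's share of all words on the page (0.0-1.0)."""
    # Crude tag strip; good enough for a quick spot check.
    text = re.sub(r"<[^>]+>", " ", html_text).lower()
    words = re.findall(r"[a-z0-9']+", text)
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Count non-overlapping-style phrase hits by sliding a window over the words.
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == kw_words
    )
    return hits * n / len(words)

page = "<div>mp3 download free mp3 download best mp3 download</div>"
print(f"{keyword_density(page, 'mp3 download'):.0%}")  # prints 75%
```

A page where one phrase accounts for most of the visible words, as in the toy example above, is the 2001-style stuffing being described; if a competitor's numbers look normal, something else is driving their rankings.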
-
The best you can do is report the website for cloaking via Webmaster Tools.
-
Yes, it is working for them, but I don't want Google to penalize my site down the road.
-
Is it working for them? If so, see if it works for you.
There is no white hat or black hat; there are only differing levels of risk tolerance. And risky SEO does pay off when done correctly.