Black SEO --> Attack
-
Hello there,
Happy New Year to everyone, and good luck this year.
I have a real problem here. I saw in Moz's link history that the "Total Linking Root Domains" count has somehow grown from an average of 30 - 40 to 240 - 340 links, and it keeps growing. I guess somebody is playing a good joke on me, because I did not buy any links :)) There are even .cn, Brazilian, and .jp links, and my store is from Romania.
How can I block these links? I think Google will penalise me instead. What should I do?
Thank you so much.
With respect,
Andrei -
Hello all, and thank you for answers.
I disavowed the links (500), but there has been no effect. Three months have passed. Nothing; I am still "banned", if I can say so.
Anyway, for unserious players (SEOs), this sets a precedent. If you want to "kick" the competition, just buy them 500 - 1,000 bad links, and you should take 1st position. Google must pay attention to this kind of garbage. Really now :))
With respect,
Andrei -
If you had no part in creating them, Google says you should be fine to ignore them. But if you want to be sure, it's not a bad idea to comb through them and disavow at the domain level.
There is info in here that may help:
http://moz.com/blog/preparing-for-negative-seo
http://moz.com/blog/guide-to-googles-disavow-tool
You may also want to have a good look for malware if you are seeing a sudden increase in links to your site. Sometimes a Google search for site:yoursite.com viagra | cialis | loans can help.
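The domain-level disavow suggested above can be sketched with a short Python script. This is a minimal example, not an official tool: the input URLs are hypothetical placeholders, and you would feed in your own exported backlink list. It collapses spammy URLs down to their hosts and emits entries in Google's disavow file format (`domain:` prefix, `#` comments).

```python
from urllib.parse import urlparse

# Hypothetical input: one spammy backlink URL per line,
# e.g. copied out of a Moz or Majestic export.
spammy_links = [
    "http://cheap-links.example.cn/page1.html",
    "http://cheap-links.example.cn/page2.html",
    "http://spam-farm.example.jp/index.php?id=42",
]

# Disavowing at the domain level covers every link from that host,
# so collapse the URLs down to their unique hosts.
domains = sorted({urlparse(link).netloc for link in spammy_links})

# Google's disavow file format: '#' starts a comment,
# whole-domain entries use the 'domain:' prefix.
lines = ["# Disavow file generated from exported backlink data"]
lines += [f"domain:{d}" for d in domains]
disavow_file = "\n".join(lines)
print(disavow_file)
```

You would save the printed text as a plain-text file and upload it through Google's disavow links tool in Search Console.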
-
Morning ('Neatza) Andrei,
First of all, try to monitor new backlinks to the website: where are they coming from, and what types of links are pointing to your site? Are all of these links low quality?
I'd use Majestic for this problem (they identify new links much faster than Moz), so you can export the raw data, analyze it, and create a disavow file for the low-quality links.
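The export-analyze-disavow workflow described above could be sketched like this. The column names, example rows, and the Trust Flow cutoff are all assumptions for illustration; adjust them to match whatever the actual export contains.

```python
import csv
import io

# Hypothetical CSV export of new backlinks: source URL plus a quality score.
export = io.StringIO(
    "SourceURL,TrustFlow\n"
    "http://goodblog.example.com/review,45\n"
    "http://link-farm.example.cn/p1,2\n"
    "http://spam.example.biz/x,0\n"
)

LOW_QUALITY_THRESHOLD = 10  # assumed cutoff; tune against your own data

low_quality = []
for row in csv.DictReader(export):
    if int(row["TrustFlow"]) < LOW_QUALITY_THRESHOLD:
        # Keep just the host for a domain-level disavow entry.
        host = row["SourceURL"].split("/")[2]
        low_quality.append(f"domain:{host}")

disavow = "\n".join(sorted(set(low_quality)))
print(disavow)
```

Only the low-scoring hosts end up in the disavow text; the good blog link is left alone.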
Gr., Keszi
-
Hi Andrei
You should definitely disavow these links if you suspect they are bad or dodgy. A very thorough guide on how to do so can be found here: http://moz.com/blog/guide-to-googles-disavow-tool
This will help you avoid getting penalised by Google.
Regards