Help! Am I Being Attacked?
-
Hello,
I don't put much stock in spammy link attacks, and I definitely don't believe my site is worth attacking.
However, I'm seeing new links pointing to my site that I can't account for.
I just spotted three articles on a poor-quality article site with exact-match anchor text pointing to me. The articles are completely unique (I ran them through Copyscape), and according to the site's timestamps they were posted during Oct and Nov 2012. (They also appear in WMT's recently discovered links from around the same time.)
What should I do (besides disavowing this domain)?
Thanks
-
I actually agree with Mark Ginsberg that it might be an SEO firm we hired in the past (we haven't had anyone for over six months) that pipelined articles on our behalf.
But yes, the anchors are our main keywords pointing to the exact landing pages.
-
Are they your main keyword phrase? I'm not sure why someone would go through the trouble of making the articles unique if they wanted to sabotage you. Have you tried contacting the author?
-
Thanks, what you are saying makes sense.
(Even though we haven't used any SEO firm for many months now.)
-
Have you outsourced link building / SEO services to anyone? It could be that they used a tool or outsourced the work to someone else, and these articles only went live in Oct. and Nov., even though they had technically gone through the pipeline a few months prior.
It doesn't seem like someone would attack your site in that manner with a few articles on a crappy site - they would use sitewides, thousands of directory submissions, social bookmarks, etc., for much cheaper than having 3 unique articles written and posted with anchor text.
I'm more of the opinion these are remnants of an old link building strategy than of a malicious attack to hurt your site.
Mark
-
Yes these links are the only ones. It is really strange...
-
In those articles, are the backlinks to your website the only ones?
If there are other backlinks as well, maybe whoever posted the articles added the links to your site without any intention to harm you; they may simply have included other links to make the articles look more natural.
I'm not a big fan of the Disavow Links tool. In the official announcement, Google said it should mainly be used if you have received an unnatural links warning. My advice is to always think twice before using it.
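For reference, if you do decide to disavow, the file you upload in Webmaster Tools is plain text with one entry per line: a `domain:` prefix disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal sketch (the domain below is a placeholder, not the actual article site):

```text
# Unnatural articles spotted Oct-Nov 2012
# example-article-site.com is a placeholder domain
domain:example-article-site.com

# Or, to disavow only specific pages instead of the whole domain:
http://example-article-site.com/some-article.html
```

Disavowing the whole domain is usually safer than listing individual URLs, since the same site could link again from new pages.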