Help! Am I being attacked?
-
Hello,
I don't put much stock in spammy link attacks, and I definitely don't believe my site is worth attacking.
However, I'm seeing new links pointing to my site that I have no idea where they came from.
I just spotted three articles on a poor, crappy article site with exact-match keywords pointing to me. The articles are completely unique (I ran them through Copyscape), and according to the site's time stamps they were posted during Oct and Nov 2012. (They also appear in the WMT recently discovered links from more or less the same time.)
What should I do (besides disavowing this domain)?
Thanks
-
I actually agree with what Mark Ginsberg said: it might be an SEO firm we hired way back (we haven't had anyone for over 6 months) that pipelined articles on our behalf.
But yes, the anchors are main keywords pointing to the exact landing pages.
-
Are they your main keyword phrase? I'm not sure why someone would go through the trouble of making the articles unique if they wanted to sabotage you. Have you tried contacting the author?
-
Thanks, what you're saying makes sense.
(Even though we haven't used any SEO firm for many months now.) -
Have you outsourced link building / SEO services to anyone? It could be that they used a tool or outsourced the work to someone else, and these articles only went live in Oct. and Nov. even though they had technically gone through the pipeline a few months prior.
It doesn't seem like someone would attack your site in that manner with a few articles on a crappy site - they would use sitewides, thousands of directory submissions, social bookmarks, etc., for much cheaper than having 3 unique articles written and posted with anchor text.
I'm more of the opinion these are remnants of an old link building strategy than of a malicious attack to hurt your site.
Mark
-
Yes, these links are the only ones. It is really strange...
-
In those articles, are the backlinks to your website the only ones?
If there are other backlinks as well, maybe whoever posted the articles added the links to your site without any intention of causing you harm; they may simply have wanted extra backlinks to make the article look more natural.
I'm not a big fan of the Disavow Links Tool. In the official release, Google said the tool should only be used if you receive an unnatural links warning. My advice is to always think twice before using it.
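For anyone who does decide to go down that road: the disavow tool accepts a plain-text file, one entry per line, where lines beginning with `domain:` disavow an entire domain, bare URLs disavow individual pages, and lines beginning with `#` are comments. A minimal sketch (example.com is a placeholder, not the actual article site from this thread):

```text
# Links from article site discovered in WMT, Oct-Nov 2012
# Disavow the whole domain:
domain:example.com
# ...or disavow individual article URLs instead:
http://example.com/some-article.html
```

You upload the file through the disavow tool; entries apply only to your own site's link profile.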
Related Questions
-
Complicated Title Tag Issues. Experts, Please Help!
Hey there Moz community! This is the first time I've asked a question here, so please forgive me if I miss any forum etiquette. I am managing SEO for an educational site built in React.js, and so far much of the job has been keyword research and site optimization. The site still has slow PageSpeed, though.

The issue: 4 weeks ago we published 20 or so content pieces for which I had pre-prepared title tags and meta descriptions. But when we released the content, a programming error made all 20 pages show the same title tag instead of the pre-prepared individual ones. I noticed this after 3 days and the issue was fixed within 6 days, but by then Google had crawled and indexed the pages. Now I can't get Google to switch to the pre-prepared tags no matter what I do! I've tried changing the content, changing the URL of one of the pages, and sending Google's spiders to re-crawl the pages multiple times.

The super weird thing is that the correct title tag shows in the navigation bar/tab bar in Google Chrome, but NOT when I view the source code for the page. Yesterday I was taking a walk in the park and I just couldn't stop thinking about it (it is really starting to get to me by now, since nothing works), so I ran back home and looked closely at one of these pages in Google Search Console. And I noticed something I hadn't seen before: BOTH of the title tags can be found in the HTML.

Pre-prepared title tag: <title>UK Seat Belt & Car Seat Laws: The Definitive Guide</title>
The other title tag (in the src section): title=Ace%20The%20DMV%20Permit%20Test%20%26%20Get%20Your%20License

Could this be the problem, or what do you think? I understand that Google automates title tags and can choose its own if it thinks they fit better, but these title tags aren't even close to describing the topic as it is now, so it doesn't make any sense. All answers are greatly appreciated!
Your advice is life-saving for a learner like me. P.S. I love SEO but it can be very frustrating sometimes! Thank you very much, Leo
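As a small aside on the percent-encoded title found in the `src` section above: it decodes to a readable string, which makes it easy to see exactly which title leaked onto the pages. A quick check (Python used purely as an illustration here):

```python
from urllib.parse import unquote

# The percent-encoded value found in the page's src section;
# %20 decodes to a space and %26 to an ampersand.
encoded = "Ace%20The%20DMV%20Permit%20Test%20%26%20Get%20Your%20License"

decoded = unquote(encoded)
print(decoded)  # Ace The DMV Permit Test & Get Your License
```

Decoding each suspect title this way makes it straightforward to compare what is in the markup against what Google is actually showing.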
Intermediate & Advanced SEO | Leowa -
Site under attack from Android SEO bots - expert help needed
For the last 25 days, we have been facing a weird attack on our site. We are getting 10x the normal mobile traffic, all from Android devices searching specifically for our name. We are sure this is not authentic traffic, as it arrives from organic searches and bounces straight off. Initially, we thought this was a DDoS attack, but that does not seem to be the case. It looks like someone is trying to damage our Google reputation by performing too many searches and bouncing off. Has anyone else faced a similar issue before? What can be done to mitigate the impact on the site? (FYI - we get ~2M visits month on month, 80% from Google organic searches.) Any help would be highly appreciated.
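One practical first step in a case like this is to quantify the pattern from the server access logs before deciding on a mitigation. A minimal sketch (the sample log lines and the Android/Google substrings are stand-ins, assuming an Apache/Nginx combined log format):

```python
import re
from collections import Counter

# Two sample lines in combined log format (placeholders for real log data)
LOG_LINES = [
    '1.2.3.4 - - [01/May/2020:10:00:00 +0000] "GET / HTTP/1.1" 200 512 '
    '"https://www.google.com/" "Mozilla/5.0 (Linux; Android 9) AppleWebKit/537.36"',
    '5.6.7.8 - - [01/May/2020:10:00:01 +0000] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]

def count_android_organic(lines):
    """Count hits whose referrer is Google and whose user agent mentions Android."""
    hits = Counter()
    for line in lines:
        # In combined format, the referrer and user agent are the last two quoted fields
        quoted = re.findall(r'"([^"]*)"', line)
        if len(quoted) < 3:
            continue
        referrer, ua = quoted[-2], quoted[-1]
        if "google." in referrer and "Android" in ua:
            hits["android_organic"] += 1
        else:
            hits["other"] += 1
    return hits

print(count_android_organic(LOG_LINES))
```

Running this over the full log (and grouping by IP or time of day) would show whether the traffic comes from a small set of addresses that could simply be blocked or rate-limited at the server level.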
Intermediate & Advanced SEO | KJ_AV -
Can cross domain canonicals help with international SEO when using ccTLDs?
Hello. My question is: **Can cross domain canonicals help with international SEO when using ccTLDs and a gTLD - and the gTLD is much more authoritative to begin with?** I appreciate this is a very nuanced subject, so below is a detailed explanation of my current approach, problem, and proposed solutions I am considering testing. Thanks for taking the time to read this far!

The current setup: Multiple ccTLDs such as mysite.com (US), mysite.fr (FR), mysite.de (DE). Each TLD can have multiple languages; indeed each site has content in English as well as the native language, so mysite.fr (which defaults to French) and mysite.fr/en-fr serve the same page, but in English. Mysite.com is an older and more established domain with existing organic traffic. Each language variant of each domain has a sitemap that is individually submitted to Google Search Console and is linked from the <head> of each page. So: mysite.fr/a-propos (about us) links to mysite.com/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in French; each of these URL blocks contains hreflang info for that content on every ccTLD in every language (en-us, en-fr, de-de, en-de, etc.). Likewise, mysite.fr/en-fr/about-us links to mysite.com/en-fr/sitemap.xml, which contains URL blocks for every page of the ccTLD that exists in English, again with hreflang info for every ccTLD in every language. There is more English content on the site as a whole, so the English version of the sitemap is always bigger at the moment. Every page on every site has two lists of links in the footer. The first is a list of links to every other ccTLD available, so a user can easily switch between the French site and the German site if they want to; where possible this links directly to the corresponding piece of content on the alternative ccTLD, and where that isn't possible it just links to the homepage. The second list is essentially links to the same piece of content in the other languages available on that domain. Mysite.com has its international targeting in Google Search Console set to the US.

The problems: The biggest problem is that we didn't properly consider how we would need to start from scratch with each new ccTLD, so although each domain has a reasonable amount of content, they receive only a tiny proportion of the traffic that mysite.com achieves. Presumably this is because of a standing start with regards to domain authority. The second problem is that, despite hreflang, mysite.com still outranks the other ccTLDs for brand-name keywords. I guess this is understandable given the mismatch in DA. This is based on looking at search results via the Google AdWords Ad Preview tool and changing language, location, and domain.

Solutions: The first solution is probably the most obvious: move all the ccTLDs into a subfolder structure on mysite.com and 301 all the old ccTLD links. This isn't really ideal for a number of reasons, so I'm trying to explore some alternative routes that might help the situation. The first thing that came to mind was cross-domain canonicals. Essentially this would mean creating locale-specific subfolders on mysite.com and duplicating the ccTLD sites in there, but using a cross-domain canonical to tell Google to index the ccTLD URL instead of the locale-subfolder URL. For example: mysite.com/fr-fr has a canonical of mysite.fr, and mysite.com/fr-fr/a-propos has a canonical of mysite.fr/a-propos. Then I would change the links in the mysite.com footer so that they point at the sub-folder URLs rather than the ccTLD URLs, so that Google would crawl the content on the stronger domain before indexing the ccTLD version of the URL. Is this worth exploring with a test, or am I mad for even considering it? The alternative that came to mind was to do essentially the same thing but use a 301 to redirect from mysite.com/fr-fr to mysite.fr. My question is whether either of these suggestions might be worth testing, or am I completely barking up the wrong tree and liable to do more harm than good?
Intermediate & Advanced SEO | danatello -
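For reference, the cross-domain canonical idea described in that question would mean the head of mysite.com/fr-fr/a-propos contains something like the hypothetical fragment below (URLs taken from the question itself). One caveat worth knowing: Google's hreflang guidance expects alternate annotations to point at canonical URLs, so combining a cross-domain canonical with hreflang annotations on the same duplicated page can send conflicting signals, which is one reason a setup like this would need careful testing:

```html
<!-- Hypothetical <head> fragment for mysite.com/fr-fr/a-propos -->
<!-- The canonical points at the ccTLD version that should be indexed -->
<link rel="canonical" href="https://mysite.fr/a-propos" />
<!-- hreflang alternates for the same piece of content in other locales -->
<link rel="alternate" hreflang="fr-fr" href="https://mysite.fr/a-propos" />
<link rel="alternate" hreflang="en-fr" href="https://mysite.fr/en-fr/about-us" />
```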
Help! Forum (user generated content) SEO best practices
Hello Moz folks! For the very first time I'm dealing with a massive community that relies on UGC (user generated content). Their forum is suffering from a great deal of duplicate content, broken links, duplicate titles, and other on-site issues. I have advanced SEO knowledge related to ecommerce and blogging, but I'm new to forums and UGC. I would really love to learn, or get resource links that would allow me to see and understand the best practices in terms of SEO. Any help is greatly appreciated. Best, Yan
Intermediate & Advanced SEO | ydesjardins200 -
Clean URL help!
Hi all, In short, I'm looking to redirect examplepage.html to examplepage. I got rid of the .html sitewide this morning. However, I want to redirect Google and people who have bookmarked the old URL structure. Currently the page resolves whether the extension is on the URL or not. I want /examplepage.html to 301 redirect to /examplepage. I've gone the normal way I'd do it by adding to .htaccess: Redirect 301 /examplepage.html http://www.example.com/examplepage I'm assuming it isn't redirecting because the examplepage.html page no longer exists... what is the way around this? Thanks for any help! In Firefox the error on the page is: "The page isn't redirecting properly. Firefox has detected that the server is redirecting the request for this address in a way that will never complete."
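That "will never complete" error is the classic symptom of a redirect loop: whatever rule now serves extensionless URLs internally rewrites /examplepage back to /examplepage.html, which the `Redirect 301` then sends back to /examplepage again, forever. A common mod_rewrite pattern avoids this by matching `THE_REQUEST` (the original request line from the browser, which internal rewrites never change). This is a sketch to adapt and test on a staging copy first, not a drop-in fix:

```apache
RewriteEngine On

# Externally 301 only direct client requests for .html URLs
# (THE_REQUEST reflects the browser's original request, so internal
# rewrites below cannot re-trigger this rule and cause a loop)
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/([^.?\s]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally serve the .html file for the extensionless URL
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```

With this in place, /examplepage.html 301s to /examplepage for browsers and Googlebot, while the server quietly serves the .html file behind the scenes.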
Intermediate & Advanced SEO | Whittie -
Help! My Domain Authority keeps dropping! What do I do?
Hey! I just noticed my Domain Authority keeps dropping. What's happening, and what do I do to get it back up? I'm scared and don't know the next move to make to improve this site. Help please! Thanks! http://www.moondoggieinc.com Kristy O
Intermediate & Advanced SEO | KristyO -
International IP redirection - help please!
Hi, We have a new client who has built a brand in the UK on a xyz.com domain. "xyz.com" is now a brand and features on all marketing. Lots of SEO work has taken place, and the UK site has good rankings and traffic. They have now expanded to the US, and with offline marketing leading the way, xyz.com is the brand being pushed in the US. So with the launch of the offline marketing, US IPs are now redirected to a US version of the site (a subfolder) with relevant pricing and messaging. This is great for users, but because Googlebot crawls from a US IP it is also being redirected, and the UK pages have now dropped out of the index. The solution we need would ideally have both UK and US users searching for xyz.com land on their respective static pages with correct prices. Ideally no link authority would be moved via redirection of users. We have considered the following solutions:
1. Move the UK site to a /uk subfolder and redirect UK IPs to this subfolder (and so not Googlebot). The downside is that this will massively impact the UK rankings, which are the core driver of the business. Also, would this be deemed illegal cloaking? Natural links will always be to the xyz.com page, so longer term the US homepage will gain authority and the UK homepage will be more reliant on artificial link building.
2. Use an overlay that detects the IP address and asks users to select the relevant country (with cookies to redirect on a second visit). This was rejected by the ecommerce team as it will increase bounce rate, and we don't want users to be able to see other countries due to product and price differences.
3. Use a homepage with country selection (with cookies to redirect on a second visit). This was rejected by the ecommerce team for the same reasons.
Is there an easy solution to this problem that we're overlooking? Is there another way of legal cloaking we could use here?
Many thanks in advance for any help here.
Intermediate & Advanced SEO | Red_Mud_Rookie -
Does a 'Certified Domain' help SEO?
I see that GoDaddy offers a 'Certified Domain' option. Does this help SEO at all?
Intermediate & Advanced SEO | Techboy