Changing Links to Spans with Robots.txt Blocked Redirects using Linkify/jQuery
-
Hi,
I was recently penalized, most likely because Google started following JavaScript-generated links to bad neighborhoods that were not nofollowed. The first thing I did was remove the Linkify plugin from my site so all those links would disappear, but I now think I have a solution that works with Linkify without creating crawlable links. I did the following:
-
I blocked access to the Linkify scripts via robots.txt so that Google won't execute the scripts that create the links. This has worked for me in the past with banner ads linking to other sites of mine; at least it appears to work, because those sites did not show links from the pages running those banners in Search Console.
-
I created a /redirect/ directory that redirects all offsite URLs, and I put a robots.txt block on this directory.
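For reference, the robots.txt entries would look something like this (the Linkify script path shown here is an assumption; use whatever path the plugin scripts actually load from):

```
User-agent: *
Disallow: /redirect/
Disallow: /scripts/linkify/
```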
-
I configured the Linkify plugin to parse URLs into span elements instead of a elements and to add rel="nofollow" attributes. The spans still have an href attribute, but the URLs in the href now point to the redirect directory, and the span's onclick event redirects the user.
I have implemented this solution on another site of mine, and I am hoping it will make it impossible for Google to categorize my pages as linking to any neighborhoods, good or bad. Most of the content is UGC, so this should discourage link spam while still giving users clickable URLs and letting people post complaints about people who have profiles on adult websites. Here is a page where the solution has been implemented: https://cyberbullyingreport.com/bully/predators-watch-owner-scott-breitenstein-of-dayton-ohio-5463.aspx. The Linkify plugin can be found at https://soapbox.github.io/linkifyjs/, and the custom jQuery is as follows:
jQuery(document).ready(function ($) {
    $('p').linkify({
        tagName: 'span',
        attributes: { rel: 'nofollow' },
        formatHref: function (href) {
            // Encode the target so query strings in it survive the round trip
            return 'https://cyberbullyingreport.com/redirect/?url=' + encodeURIComponent(href);
        },
        events: {
            click: function (e) {
                window.location.href = $(this).attr('href');
            }
        }
    });
});
-
-
I know I was penalized because Search Console recorded an overnight drop in impressions/clicks of 75%; it had never been a problem before. I didn't need to run Screaming Frog or anything to find the links that explain the problem. I just browsed some of the user-generated content, and once I realized that Google was crawling JavaScript-generated links it all made sense.
Some of the sites people were linking to included: CheaterLand.com, PredatorsWatch.com, DirtyHomeWreckers.com, EscortBabylon.net, SugarDaddyforMe.com, GFEMonkey.com, EscortBabylon.com, CityXGuide.com, and AdultLook.com.
I think as long as Google respects the robots.txt directives I should be just fine. The redirect page itself is blocked by robots.txt, so even if Google finds one of those URLs (ex: https://cyberbullyingreport.com/redirect/?url=https://moz.com/community/q/changing-links-to-spans-with-robots-txt-blocked-redirects-using-linkify-jquery) it shouldn't even follow the 302, since the robots.txt file (https://cyberbullyingreport.com/robots.txt) disallows that directory.
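The reasoning above boils down to simple path matching: any URL under /redirect/ falls inside the disallowed prefix, so a compliant crawler never fetches it and never sees the 302. A simplified illustration (real robots.txt parsing also handles wildcards, Allow rules, and per-agent groups):

```javascript
// Toy prefix-only Disallow matching, for illustration of the idea above.
function isDisallowed(urlPath, disallowRules) {
  return disallowRules.some((rule) => urlPath.startsWith(rule));
}
```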
-
On 301 vs. 302 redirecting: if you're going to do one, you should do a 302, but it's not going to help you much unless you also send the URL to a 410. Nofollowing a link accomplishes the same thing here as robots.txt will.
https://support.google.com/webmasters/forum/AAAA2Jdx3sUEbHp0yjgT6c?hl=sv
Do you have a report from Google confirming that you have been penalized?
Is there any way you could run Screaming Frog and show some of these URLs that you're talking about?
respectfully,
Tom