Cutting off bad link juice
-
Hello,
I have noticed that there are plenty of old, low-quality links pointing to many of the landing pages. I would like to cut them off and start again. Would it be OK to do the following?
1. Create new URLs (the domain is quite strong and the new pages rank well, better than the affected old landing pages) and move the old content there.
2. 302 redirect the old landing pages to the new ones.
3. Put a "noindex" tag on the old URLs (maybe even "noindex, nofollow"?), or wouldn't that work?
Thanks in advance
-
Hello all,
Thank you for your answers,
Oleg, I am not that keen on meta refresh, as it makes for a poor user experience. Apparently the delay needs to be about 10 seconds, as Google may treat a shorter one as a 301. I wonder what the shortest delay is that would still lose the link juice but wouldn't disturb my visitors.
Gagan, in regards to 301 redirecting the bad page to a 404 page: isn't it easier just to return a 404 directly, without the redirect?
Mike, what do you think is the best solution to keep the traffic but cut off the bad links to specific landing pages?
I will be testing a 302 soon from the old URL to the new one. I wonder if I should ALSO put a 404 on the old one... or maybe noindex... or doesn't it matter? What are your thoughts?
-
Does it seem okay to give a site page (one targeted by spam links) a 301 redirect that lands on a 404 error page?
It's a CMS where many other pages also link to it through other subcategories of the component, so the option for cutting off the bad page, which is being hurt by low-quality links, is a 301 redirect that lands on a 404 error page. Will that diminish, or completely cut off, the value of all the spam links pointing to it, so that they don't affect the site at all?
-
Upon further research, you are correct. A noindexed page is still crawled; it just isn't shown in the SERPs. So any links will still be followed and the page is still a part of the website. With this in mind, I think you should 404 the page and redirect users via a delayed meta refresh. Reach out to the webmasters of the good links and ask them to point to the new URL.
I still don't think a 302 is the way to go in this scenario. Ideally, you'd experiment with different options and see which produces the best results.
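If you do go the 404-plus-delayed-meta-refresh route, a sketch of it on Apache might look like this (paths are made up for illustration):

```apache
# Serve a custom "gone" document for 404s; that page itself can carry a
# delayed meta refresh for human visitors (hypothetical path)
ErrorDocument 404 /gone.html
```

The old URL keeps returning a 404 status, so bots should eventually drop it, while gone.html can contain the timed meta refresh pointing people at the new page.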
-
Personally I would go with Oleg's original suggestion: "If your rankings are being hurt by these links, I would move them to a new URL and 404 the old page. I would then go through the link profile for the old URLs. Find all the high quality links and contact the webmasters asking to change it to the new URLs."
-
Sure, but Oleg said, "If you noindex the page, G won't be able to access it and it will lose all its authority".
If the page loses all its authority, will it still pass on negative value to the domain or to other pages because of the low-authority or spam backlinks pointing to it?
If that's true, then maybe cutting the page off from the site by making it a 404 is the better way!
-
NoIndex won't cut the links; it will just remove the page from the SERPs. So you'll still be hit with the bad links to your site, and the page's organic traffic will be cut off.
-
Sure, thanks
Does that mean that if we noindex it, we can safely presume all the low-quality links pointing to that URL will be nullified and will have no negative effect on the site? I mean, there won't be any need to make the page a 404 if we still use that page as a regular part of the site, e.g. for filling in forms.
Many thanks, once again for your detailed reply
-
So his goal is to have users redirected to the new page without having Google pass the link authority to the new URL.
If you noindex the page, G won't be able to access it and it will lose all its authority. But any user that visits the page will still be redirected to the new URL. There is no such thing as a 404 redirect.
Meta refresh is another way to redirect users to a new page without passing authority. As long as the delay is greater than 0 (a meta refresh with time=0 is treated similarly to a 301), it shouldn't pass authority. So it's the same deal: noindex the page and set up a redirect for users, not bots.
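As a concrete sketch of that combination (the URL here is hypothetical), the old page's head could carry both tags:

```html
<head>
  <!-- keep the old page out of the index -->
  <meta name="robots" content="noindex">
  <!-- after 5 seconds, send human visitors on to the new URL;
       a delay of 0 would be treated like a 301, so keep it above zero -->
  <meta http-equiv="refresh" content="5; url=https://example.com/new-landing-page">
</head>
```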
-
Hello Oleg,
I am also interested in knowing more about this.
Is marking that page noindex, follow or noindex, nofollow a better way than a 404 redirect?
Also, I didn't follow you on the meta refresh redirect. What does that mean?
-
A 302 is by definition a temporary redirect, which is not really applicable here. According to this 302 experiment, 302s did actually pass some authority down (which may or may not hurt you). I do see the UX advantage of having the old URL redirect to the new page, though.
Another alternative is to block the page via robots.txt and set up a redirect, or to noindex the page and set up a timed meta refresh redirect to the new page.
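The robots.txt half of that alternative can be sanity-checked offline with Python's standard library parser (the domain and paths here are hypothetical):

```python
# Sketch: check that a robots.txt rule (hypothetical paths) keeps crawlers
# away from the old landing page, using only the standard library.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /old-landing-page
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The blocked old page may not be fetched...
print(parser.can_fetch("Googlebot", "https://example.com/old-landing-page"))  # False
# ...but everything else stays crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/new-landing-page"))  # True
```

Worth noting: blocking a URL in robots.txt only stops crawling of the page itself; it doesn't remove the page or its backlinks, which is why the noindex-plus-meta-refresh option is discussed alongside it.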
-
Thank you Oleg,
I have checked, and I have a few .gov.uk links going to some of those pages which generate some traffic, so I'm not sure a 404 on them is suitable in this situation.
On the other hand, why is a 404 better than a 302? They both stop link juice from passing, but the 302 keeps the traffic.
-
If your rankings are being hurt by these links, I would move them to a new URL and 404 the old page. I would then go through the link profile for the old URLs. Find all the high quality links and contact the webmasters asking to change it to the new URLs.