Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Should I disavow links from pages that don't exist any more?
-
Hi. I'm doing a backlink audit on two sites, one with 48k backlinks and the other with 2M. Both are very old sites, and both have tons of backlinks from old pages and websites that no longer exist, but these backlinks still show up in the Majestic Historic index. I cleaned up the obvious useless links and passed the rest through Screaming Frog to check whether those old pages/sites even exist.
There are tons of linking pages that return a 0 (no response), 301, 302, 307, or 404 status. Should I consider all of these pages to be bad backlinks and add them to the disavow file?
Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page returning an error when pinged, e.g. originpage.com/page-gone sends a link to mysite.com/product1. Screaming Frog pings originpage.com/page-gone and gets back a status error. Do I add originpage.com/page-gone to the disavow file or not?
Hope I'm making sense.
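If one did decide to treat dead referrers as disavow candidates, the Screaming Frog status export could be post-processed along these lines. This is a minimal sketch, not the thread's recommendation: the URLs and statuses are invented, and the choice to disavow only hard-dead pages (0/404/410) while leaving redirects alone is an assumption.

```python
# Hypothetical sketch: given crawl results (referring URL -> HTTP status),
# build disavow-file lines for domains whose linking pages are dead.
from urllib.parse import urlparse

def build_disavow(crawl_results, bad_statuses=frozenset({0, 404, 410})):
    """Return disavow-file text for domains with dead linking pages.

    crawl_results maps referring-page URL -> status code (0 = no response).
    Redirects (301/302/307) are deliberately skipped here, since the link
    may still resolve and pass value through the redirect chain.
    """
    domains = sorted({
        urlparse(url).netloc
        for url, status in crawl_results.items()
        if status in bad_statuses
    })
    # Google's disavow format: one "domain:" or URL entry per line,
    # comments start with "#".
    lines = ["# dead referring pages found via crawl"]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines)

results = {
    "http://originpage.com/page-gone": 404,   # invented example URLs
    "http://livesite.com/article": 200,
    "http://deadhost.example/old-post": 0,
}
print(build_disavow(results))
```

The resulting text can be pasted straight into the disavow file uploaded via the Disavow Links tool; whether dead pages belong there at all is exactly what the answers below debate.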
-
Sounds like a plan. Thanks for your help bud, much appreciated.
-
My take: I'd just go ahead and start doing other things to improve its current rankings. I could assign someone to go over the links if another team member is available.
If I see improvements within the next month, that's already a good sign that you should continue and not worry about the dead links.
It takes Google a long time to actually forget about those links pointing to your site. So if they are dead and you didn't notice any increases or drops in analytics, then they are pretty much ineffective and shouldn't be a major obstacle. I think someone coined a term for it: ghost links or something. LOL.
-
Hi. I did go through GA several years back, back to 2011, but didn't really see dramatic changes in traffic, just a general trend of low organic traffic throughout. Keep in mind that it's an engineering site, so not thousands of visits per day... the keywords that are important for the site get below 1,000 searches per month (data from the days when Google Keyword Tool shared this info with us mortals).
That said, in roughly 60% of the links I notice absolutely no regard for anchors: some are www.domain.com/index.php, Company Name, some are Visit Site, some are Website, etc. Some anchors are entire generic sentences like "your company provided great service, your entire team should be commended blah blah blah". And there are tons of backlinks from http://jennifers.tempdomainname.com... a domain that's a weird animal, as there's not much data on who they are, what they do, and what the deal is with the domain name itself. Weird.
In all honesty, nothing in WMT or GA suggests that the site got hit by either Penguin or Panda... BUT, with a ton of links that originate from non-existent pages, pages with no thematic proximity to the client site, and anchors as generic as "Great Service"... is it better to err on the side of caution and get them disavowed, or to wait for a reason from Google and then do the link hygiene?
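An anchor-text mix like the one described above can be tallied quickly from a backlink export. A rough sketch, with the caveat that the `AnchorText` column name and the sample rows are assumptions (Majestic's CSV layout varies by report):

```python
# Hypothetical sketch: tally anchor-text distribution from a backlink
# export CSV, so generic anchors like "Visit Site" stand out by count.
import csv
import io
from collections import Counter

def anchor_distribution(csv_file, anchor_col="AnchorText"):
    """Count anchor texts (case-folded), most common first."""
    counts = Counter(
        row[anchor_col].strip().lower()
        for row in csv.DictReader(csv_file)
    )
    return counts.most_common()

# Inline demo with made-up rows; a real run would open the export instead:
# with open("backlinks.csv", newline="", encoding="utf-8") as f:
#     print(anchor_distribution(f))
sample = io.StringIO("AnchorText\nVisit Site\nvisit site\nGreat Service\n")
print(anchor_distribution(sample))  # → [('visit site', 2), ('great service', 1)]
```

A profile where branded and topical anchors are rare and generic ones dominate is the pattern the question is worried about, so a count like this gives a concrete number to put against that "roughly 60%" estimate.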
-
Hi Igor,
Seeing ezinearticles in there is definitely a red flag: it tells you the profile probably also has web directories, article networks, blog networks, pliggs, guestbooks, and other links from that era.
Maybe you can dig up some old analytics data and check when the traffic dropped.
If you didn't see any heavy anchor-text usage, then the site probably escaped a sitewide penalty; I would assume it's just a few (or many, but not all) of the keywords that got hit. Either way, you'll need to clean up and disavow the links if they are indeed like that. So that's probably a reason for its low organic rankings.
That, and since it's old, it might have been affected by Panda too.
-
Thanks for your response. I'm about done with cleaning up the link list in very broad strokes, eliminating obvious poor-quality links, so in a few hours I could have a big list for disavowing.
The site is very specific, a mechanical engineering niche, and they sell technology and consulting to GM, GE, Intel, NASA... so backlinks from sites for rental properties and resorts do look shady... even if they do return a 200 status.
But how vigilant is Google now, with all the Penguin updates, about backlinks from unrelated sites? My client's site has tons of them. And if Majestic reports them as having zero Trust Flow, is there any benefit in having them at all?
Thanks.
-
Hi. Thanks for responding. WMT actually shows just a fraction of the links: a few thousand for the site that Majestic Historic reports at 48k. But I don't have any notifications of issues. I'm guessing that with all the Penguin updates most sites won't get any notifications, and it's up to us SEO guys to figure out why rankings are so low.
About the quality of the links: many do come from weird sites, and I've noticed ezinearticles too. The problem is that the 48k portfolio was built by non-SEO experts, and now, a few years after the fact, I'm stuck with a site that doesn't rank well and has no notifications in WMT. Can I take the lack of notifications as evidence that the site has no backlink problem, or should I infer a problem from the poor organic rankings?
-
If I were in a similar situation I wouldn't really worry about it, but if it didn't take too much of my time, I would include all of these in the disavow file too.
That said, if the page is not returning a 200 status, this shouldn't really be a problem.
Hope this helps!
-
Hi Igor,
Do they still show up in Webmaster Tools? Do you have a penalty because of those pages that used to link to the site? If not, I wouldn't really worry about it; prioritize other things and make this a side task.
Are the majority of them on bad-looking domains? If you checked the link URLs on archive.org, were they spammy links? Then go ahead and include them in the disavow list.