Dead links-urls
-
What is the quickest way to get Google to clean up dead
links? I have 74,000 dead links reported back. I added a robots.txt
disallow and submitted removal requests through Google Webmaster Tools four months ago.
The same dead links also show in Open Site Explorer. Thanks.
-
Thank you, a 410 was what I needed.
-
I had 3 inexperienced developers who did not know how to write
the URLs, so they created many different versions. It is hard to work out which is
the right version. I also have a problem with BT: after leaving their service 2
years ago, they still seem to index my old links even though my domain is no longer
with them. They are all 404 pages.
-
Can you offer clarification as to what you view as a "dead link"?
If you mean a link to a page on your site which no longer exists, then your options are as follows:
-
take John's advice and 301 the link to the most relevant page on your site
-
allow the link to 404. 404 errors are a natural part of the internet. You should be sure to have the most helpful 404 page possible. Include your site's normal navigation, a search box, most popular articles, etc.
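If you go the 404 route, a custom, helpful error page can be wired up on an Apache server with a single `.htaccess` directive. This is only a sketch; the `/404.html` path is a placeholder for whatever error page you build:

```apache
# Serve a custom error page for missing URLs.
# /404.html is a hypothetical path -- point it at your own helpful
# 404 page (with navigation, a search box, popular articles, etc.).
ErrorDocument 404 /404.html
```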
Blocking pages in robots.txt is usually a bad idea and is more likely to add to the problem rather than fix it. If you do not block the page in robots.txt, Google will crawl the page, see the error, and remove it from their index. Because you have blocked the page, they can no longer crawl it, so they will likely leave it in the index longer.
Google normally promptly removes web pages submitted with the URL Removal Tool. The tool is designed to remove pages which can damage your site or others. For example, if confidential information was accidentally published. It is not designed to be used to remove 74k pages because you decided to remove them from your website.
If these dead links are simply pages you removed from your site, I advise you to remove the robots.txt block and then either 301 them or allow them to 404. Google should then clean up the links within 30-60 days.
If you want to speed up the process as much as possible, there is one other step you can take. Set a 410 (Gone) code for the pages. When Google receives a 404 response, they are unsure if the page is simply temporarily unavailable so they keep it in the index. If they receive a 410 response, you are telling Google the page is GONE and they can update their index faster.
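To see the 404-vs-410 difference in practice, here is a minimal Python sketch: a tiny local HTTP server that answers 410 Gone for URLs you have deliberately removed and 404 Not Found for everything else, plus a helper that checks what a crawler would receive. The paths here are made-up examples, not anything from the thread:

```python
import http.server
import threading
import urllib.error
import urllib.request

# Hypothetical set of URLs that were removed on purpose.
REMOVED_PATHS = {"/old-product"}

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REMOVED_PATHS:
            self.send_error(410)  # Gone: removed deliberately, for good
        else:
            self.send_error(404)  # Not Found: might only be temporary

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def status_of(url):
    """Return the HTTP status code a GET request receives."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# Bind to port 0 so the OS picks a free port.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = "http://127.0.0.1:%d" % server.server_address[1]

gone_code = status_of(base + "/old-product")       # 410
missing_code = status_of(base + "/never-existed")  # 404
server.shutdown()
```

A crawler hitting `/old-product` gets an unambiguous "gone for good" signal, while any other missing URL gets the weaker "not found right now".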
-
-
Are the links 404 pages or just expired content?
-
The links are indexed. I need them really cleaned up, not
redirected, and users/customers find it frustrating.
-
You could try setting up a 301 redirect in .htaccess.
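As a sketch, the 301 and 410 rules discussed above might look like this in an Apache `.htaccess` file. All paths here are hypothetical placeholders:

```apache
# 301: permanently redirect an old URL to its best replacement page.
Redirect 301 /old-page /new-page

# 410: tell crawlers a removed URL is gone for good (no replacement).
Redirect gone /removed-page

# Or mark a whole family of dead URLs as gone in one rule:
RedirectMatch 410 ^/old-section/.*$
```

`Redirect` and `RedirectMatch` come from Apache's mod_alias, which is enabled on most shared hosts; check that `.htaccess` overrides are allowed on your server before relying on this.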
Got a burning SEO question?
Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.
Browse Questions
Explore more categories
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.
Related Questions
-
WooCommerce filter URLs showing in crawl results, but not indexed?
I'm getting hundreds of Duplicate Content warnings for a WooCommerce store I have. The URLs are
Moz Pro | | JustinMurray
etc. These don't seem to be indexed in Google, and the canonical is for the shop base URL. These seem to be simply URLs generated by WooCommerce filters. Is this simply a false alarm from the Moz crawl?
-
Can Open Site Explorer miss incoming links?
In Open Site Explorer I only see one linking domain, even though I know of at least one other. Why is that?
Moz Pro | | ResourceLab
-
My domain authority is 1, and I am not seeing any links to my site.
I know that there are links to my site, but nothing shows up. My website is www.mullinsgeoffrey.com. I have tried with and without the www. I am using WordPress for this site, but competitors are as well, and their authority seems fine. Is there a hidden setting somewhere that is screwing things up?
Moz Pro | | mullinsgeoffrey
-
Link reporting.
Is there a way in the Pro reporting to see a summary of the number of incoming links by type (blogs / news / wiki / dir / forums etc.)? Even better, could the report give me an average PageRank for each link type? Thanks,
Moz Pro | | CarlDarby
-
Duplicate pages with canonical links still show as errors
On our CMS, there are duplicate pages such as /news, /news/, /news?page=1, /news/?page=1. From an SEO perspective, I'm not too worried, because I guess Google is pretty capable of sorting this out, but to be on the safe side, I've added canonical links. /news itself has no link, but all the other variants have links to "/news". (And if you go wild and add a bunch of random meaningless parameters, creating /news/?page=1&jim=jam&foo=bar&this=that, we will laugh at you and generate a canonical link back to "/news". We're clever like that.) So far so good. And everything appears to work fine. But SEOMoz is still flagging up errors about duplicate titles and duplicate content. If you click in, you'll see a "Note" on each error, showing that SEOMoz has found the canonical link. So SEOMoz knows the duplication isn't a problem, as we're using canonical links exactly the way they're supposed to be used, and yet is still flagging it as an error. Is this something I should be concerned about, or is it just a bug in SEOMoz?
Moz Pro | | LockyDotser
-
About Links API
I'm Japanese, so I'm sorry for my poor English.
Moz Pro | | flaminGoGo
I have a question about the API.
The 'links' API returns an unauthorized error for the following request: http://lsapi.seomoz.com/linkscape/links/domain/blog?Scope=page_to_page&Sort=domain_authority&AccessID=xxx&Expires=xxx&Signature=xxx Are the request parameters OK?
-
How come there are no links to my website according to the SEOmoz Competitive Domain Analysis, while in Google Webmaster Tools I do see links?
I don't see any links at all when I do a Competitive Domain Analysis in SEOmoz. However, I do see links in Google Webmaster Tools. This strikes me as odd. Also, when I use Open Site Explorer, my website doesn't seem to be found. In Google I'm on page 9 for my focus keyword, so I do think there are links to my site. I would like to know what I can do so I can analyse my links in the SEOmoz Competitive Domain Analysis. Many thanks. URL: http://www.sadpanda.nl
Moz Pro | | Aquive
-
Www part of URL not showing in Google search results.
When typing fun translator into Google UK, my website www.funtranslator.com is the 18th result on the 2nd page. However, there is an issue: it only shows funtranslator.com, without the www part. That is not what I want, because my campaign in my SEOmoz Pro account and my Webmaster Tools account is www.funtranslator.com, not funtranslator.com. How can I resolve this? For now, I have a redirect that goes from funtranslator.com to www.funtranslator.com. I want to know how to get the full www URL into the results instead of just the URL without the www part. Also, will this affect the stats gathered in my campaign in my SEOmoz Pro account?
Moz Pro | | RyanSMurphy