Handling a Huge Amount of Crawl Errors
-
Hi all,

I am doing an on-page audit for a huge site (>1 million pages) and am faced with a crawl errors issue. Google Webmaster Tools reports:
- 404 errors: >80,000
- Soft 404 errors: 300
- 500 errors: 1,600

Many of the error links are simply not present on the pages they are reported as "linked from". I investigated a sample of those pages (and their source) looking for footprints of the error links, and found nothing. I can only see the generated HTML, not what happens behind it, so I am not able to investigate the cause myself.

So my question is: what is the appropriate way of handling this from an SEO perspective?
- Telling the client that he has to investigate it (I did my best by at least reporting the errors)?
- Engaging my firm further and getting a developer from my side to investigate?

Thanks in advance!
-
Usually an on-page audit lists all of the problems and the possible reasons they are happening, not in-depth information on how to fix every issue. That is usually the next phase: "Do you want me to work on the site, or do you want your dev team to track down the cause of the issues and fix them?"
It also depends on what type of contract you have with the client, of course.
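Whichever route you take, it helps to re-verify a sample of the reported URLs before handing them off, since GWT error reports can lag behind the live site. A minimal sketch of that check, using only the standard library (the CSV filename and `URL` column name are assumptions — adjust them to match your actual crawl-errors export):

```python
import csv
import urllib.request
from urllib.error import HTTPError, URLError

def check_status(url, timeout=10):
    """Return the current HTTP status code for a single URL (HEAD request)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code    # 404, 500, etc. are raised as exceptions
    except URLError:
        return None      # DNS failure, timeout, or refused connection

def verify_sample(csv_path, url_column="URL", limit=100):
    """Re-check a sample of URLs exported from the crawl-errors report."""
    results = {}
    with open(csv_path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if i >= limit:
                break
            url = row[url_column]
            results[url] = check_status(url)
    return results

def summarize(results):
    """Count how many sampled URLs fall into each status bucket."""
    counts = {}
    for status in results.values():
        key = status if status is not None else "unreachable"
        counts[key] = counts.get(key, 0) + 1
    return counts
```

A summary like `{200: 60, 404: 38, "unreachable": 2}` tells you quickly whether the errors are still live or already stale, which is useful context in the audit either way.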
Related Questions
-
Can bots crawl this homepage's content?
The website is https://ashleydouglas.com.au/ I tried using http://www.seo-browser.com/ to see if bots could see the content on the site, but the tool was unable to retrieve the page. I used mobile-friendly test and it just rendered some menu links - no content and images. I also used Fetch and Render on Search Console. The result for 'how google sees the page' and 'how a visitor sees the page' are the same and only showing the main header image. Anything below isn't shown. Does this mean that bots can't actually read all content on the page past the header image? I'm not well versed with what's going on with the code. Why are the elements below the header not rendering? Is it the theme? Plugins? Thank you.
On-Page Optimization | nhhernandez0
What is the perfect way to handle multiple sitemaps index in Search Console?
Hello friends, I have had this doubt for a long time and want to share it with you. At our agency, many clients have a PHP template for the home page of their sites and also have a blog with WordPress as CMS. When I optimize sitemaps, I have two separate files: a sitemap index created with WordPress SEO by Yoast (which contains separate sitemaps for tags, categories, posts, pages, authors, etc.), and the home page sitemap with its subsections. As you know, the sitemap generated by WordPress SEO by Yoast is dynamic: it builds the sitemap from the current site content and is updated every time a new entry is published or any URL is modified, which makes it very practical. I cannot have a single sitemap index with another index nested inside it, as that is not allowed by Google or the Sitemap protocol. I read in the Google Support documentation that you can upload multiple sitemaps to Search Console, but it does not say anywhere whether you can upload multiple sitemap indexes, or a combination of indexes and plain sitemaps. In my case, I would have to upload two files separately: the one generated dynamically by WordPress and the one created manually for the PHP template. In my opinion there is no problem and Google will index everything properly this way, but I wanted to share it with you to see how you solve this and what experiences you have had. Thanks and best regards.
On-Page Optimization | NachoRetta1
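For reference on the nesting constraint: per the Sitemap protocol, a `<sitemapindex>` may only list plain sitemap files in `<sitemap><loc>` entries, never another index. A minimal sketch of extracting the child sitemaps from an index, as a sanity check before submitting it (the example URLs are hypothetical, standing in for a Yoast-style index):

```python
import xml.etree.ElementTree as ET

# The sitemap protocol namespace, which Yoast and most generators use.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def child_sitemaps(index_xml):
    """Return the child sitemap URLs listed in a sitemap index document."""
    root = ET.fromstring(index_xml)
    if root.tag != f"{NS}sitemapindex":
        raise ValueError("not a sitemap index")
    return [loc.text for loc in root.iter(f"{NS}loc")]

# A hypothetical index like the one WordPress SEO by Yoast generates:
example_index = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""
```

Since each `<loc>` here must point at a plain sitemap, submitting the Yoast index and the hand-made PHP-template sitemap as two separate entries in Search Console, as described above, is the expected setup.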
Duplicate content errors
I have multiple duplicate content errors in my crawl diagnostics. The problem, though, is that I already took care of these with the canonical tag, but Moz keeps saying there is a problem. For example, this page http://www.letspump.dk/produkter/56-aminosyre/ has a canonical tag, but Moz still flags it as an error. Why is that?
On-Page Optimization | toejklemme0
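When a crawler seems to ignore a canonical, the first thing to rule out is that the tag is missing, duplicated, or in the wrong place in the served HTML. A minimal sketch of pulling every `<link rel="canonical">` out of a page's source with the standard library, so you can confirm exactly what crawlers see:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> in an HTML document."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            d = dict(attrs)
            if d.get("rel", "").lower() == "canonical" and "href" in d:
                self.canonicals.append(d["href"])

def find_canonicals(html):
    """Return all canonical URLs declared in the given HTML source."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonicals
```

A healthy page should return exactly one canonical URL; an empty list or more than one entry would explain why a crawler still reports the page as duplicated.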
Handling multiple locations in the footer
I have a client with several locations. Should I include only the main office's address in the footer? The client wants to add them all.
On-Page Optimization | SearchParty0
What's the best way to handle crawling of photo gallery?
When you have a photo gallery with many search filters and loads and loads of pages, is it best to block all the filters and use google's pagination code? Ex: http://photo.net/gallery/photocritique/filter This site has pages for many different queries. While the page titles are unique, the pages are showing duplicated content.
On-Page Optimization | cakelady0
I built a website on magentogo - IrisScottPrints.com. The seomoz crawl report states 301 rel canonical crawl notices. What if anything should I change?
Wondering if I should remove "IRIS SCOTT PRINTS |" from all the title tags and/or change the URL structure of the pages to not include the breadcrumbs. I don't really understand the whole rel canonical structure thing. Also, there are lots of "page title too long" errors — does that really matter? Lots of faith in everyone here. Thanks in advance. Marcia
On-Page Optimization | RedTrout0
Changing Subfolder that has been crawled before
Question: I am using a WordPress multisite and enabled the crawl options yesterday. www.abc.com/subfolder is the original, but I find that www.abc.com/sub is good enough. I checked site:abc.com and found that my pages in /subfolder have already been crawled. Can I just change it to www.abc.com/sub, or will that raise a duplicate content issue?
On-Page Optimization | joony20080
Can my amount of internal linking seem spammy ?
Ecommerce site. I am optimizing a separate page for each producer of products. At the moment my provider lacks some functions on the product pages (I can't put in an H1 or title, and can only put text on product pictures), like here: http://www.epleskrinet.no/smafolk/M_23. They are updating within 1-2 months to allow me to do this. This has led to some of the products themselves actually ranking higher than the producer page. What I've been doing is putting anchor-text links back to the producer page on all 9 products on this page. Is that a problem, or should I just keep doing it like this? Thanks.
On-Page Optimization | danlae0