Tracking a Crawl Error
-
Hi All,
If you find a crawl error on your page, how do you track it down?
The error only shows the URL that is wrong, not where on the site the bad link lives. Can I drill down and find more information?
Thank you!
-
Hi Martin,
Thanks for coming back to me.
You are spot on... I think I have been sitting at the desk too long today. Zoned out!
Yep, Webmaster Tools in Google shows the URL in full, so I can find it now.
Thanks for your help
-
Hi Wayne,

Just to clarify: you are having issues with a crawl error in Webmaster Tools, and when you click the URL it works fine? You can use Google Webmaster Tools > Fetch as Googlebot and input the URL to see the HTML as Googlebot sees it. You can also download a text-based browser such as Lynx to view your website very much as Googlebot does.

It will be very hard to determine a crawl path; if that information were available, we could develop trends and get a little closer to their algorithm. Crawler trackers can, on occasion, hinder the crawler.

Hope this helps.
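If you want to reproduce the "Fetch as Googlebot" idea from a script, a rough sketch is to send your request with Googlebot's User-Agent header and compare the HTML with what a normal browser gets. The URL and helper name below are made up for illustration:

```python
# Sketch: build a request that identifies itself as Googlebot.
# example.com and bot_request() are hypothetical, not from the thread.
from urllib.request import Request, urlopen

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def bot_request(url):
    """Return a urllib Request carrying Googlebot's User-Agent header."""
    return Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = bot_request("http://example.com/some-page")
print(req.get_header("User-agent"))  # urllib stores header keys capitalized
# html = urlopen(req).read()  # uncomment to actually fetch the page
```

If the HTML returned to this request differs from what you see in a browser, the server is treating bots differently, which can explain crawl errors you cannot reproduce by hand.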
Related Questions
-
A crawl revealed two home pages
After doing a site crawl using the Moz tool, I have found two home pages: www.domain.com/ and www.domain.com. Both URLs have the exact same metrics, and I have set a preferred domain name in Google. Will this hurt SEO? Should I claim www.domain.com/ as well as www.domain.com and domain.com in the Search Console? Thanks
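For what it's worth, a bare domain and the same domain with a trailing slash resolve to the same resource; the "two home pages" are usually just the crawler reporting both spellings. A minimal sketch of normalizing the two forms so they compare equal (the helper name is my own, not a Moz feature):

```python
# Sketch: treat an empty path as "/" so both homepage forms compare equal.
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Normalize a URL by giving an empty path its implicit '/'."""
    parts = urlsplit(url)
    path = parts.path or "/"
    return urlunsplit((parts.scheme, parts.netloc, path,
                       parts.query, parts.fragment))

print(normalize("http://www.domain.com"))   # http://www.domain.com/
print(normalize("http://www.domain.com/"))  # http://www.domain.com/
```

Since both forms normalize to the same URL, a rel=canonical pointing at one form is the usual way to keep metrics consolidated.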
Technical SEO | | Tom3_150 -
Increase in Crawl Errors
I had a problem with a lot of crawl errors (on Google Search Console) a while back, due to the removal of a shopping cart. I thought I'd dealt with this, and Google seemed to agree (see attached pic), but now they're all back with a vengeance! The crawl errors are all the old shop pages that I thought I'd made clear weren't there anymore. The sitemaps (generated with Yoast on WordPress) were all updated 16 Aug, but the increase didn't happen till 18-20. How do I make it clear to Google that these pages are gone forever? Screen-Shot-2016-08-22-at-10.19.05.png
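One common way to tell Google a page is gone deliberately and permanently is to serve 410 Gone for the removed URLs instead of a generic 404. A hypothetical sketch of that routing decision (the paths below are made up, and in practice you'd configure this in WordPress or .htaccess rather than application code):

```python
# Hypothetical sketch: choose a status code for removed shop pages.
# 410 Gone signals deliberate, permanent removal; 404 is ambiguous.
REMOVED_SHOP_PATHS = {"/shop/cart", "/shop/checkout", "/shop/widget-1"}

def status_for(path):
    """Return the HTTP status an assumed server would send for a path."""
    if path in REMOVED_SHOP_PATHS:
        return 410  # Gone: tends to be dropped from the index faster
    return 200      # everything else is served normally

print(status_for("/shop/cart"))  # 410
print(status_for("/about"))      # 200
```

Keeping the removed URLs out of the Yoast sitemap at the same time avoids resubmitting the dead pages to the crawler.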
Technical SEO | | abisti20 -
Xml sitemaps giving 404 errors
We have recently made updates to our XML sitemap and have split it into child sitemaps. Once these were submitted to Search Console, we received notification that all of the child sitemaps except one produced 404 errors. However, when we view the XML sitemaps in a browser, there are no errors. I have also attempted crawling the child sitemaps with Screaming Frog and received 404 responses there as well. My developer cannot figure out what is causing the errors and I'm hoping someone here can assist. Here is one of the child sitemaps: http://www.sermonspice.com/sitemap-countdowns_paged_1.xml
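When a sitemap renders in a browser but tools report 404, it helps to separate the two failure modes: the HTTP status code versus the XML itself. A small sketch that validates a child sitemap's XML locally (the sample XML below is made up for illustration, not fetched from the site):

```python
# Sketch: check that a child sitemap parses and yields <loc> URLs.
# The sample document here is invented for illustration.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.sermonspice.com/countdowns/page-1</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Parse sitemap XML and return the listed URLs."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

print(sitemap_urls(sample))
```

If the XML parses cleanly like this but tools still see 404, the problem is almost certainly in the server's response headers (e.g. a rewrite rule sending the wrong status while still emitting the body), which a `curl -I` on the sitemap URL would confirm.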
Technical SEO | | ang0 -
404 error due to a page which requires a login
What do I do with 404 errors reported in Webmaster Tools that are actually URLs where users are clicking a link that requires them to log in (so they get sent to a login page)? What's the best practice in these cases? Thanks in advance!
Technical SEO | | joshuakrafchin0 -
JavaScript: can search engines crawl it?
I have a couple of nested divs. I'd like to do an onclick="location.href='http://www.example.com';" within the outermost div so that all content within will link to one URL. Can the search engines crawl this? Thanks!
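Crawlers discover links by extracting anchor tags, so a plain onclick handler on a div is generally invisible to them. A small sketch demonstrating the difference, using a link extractor of the kind most crawlers implement (the HTML below mirrors the question's setup):

```python
# Sketch: a crawler-style link extractor only sees <a href>, not onclick.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

html = """
<div onclick="location.href='http://www.example.com';">clickable div</div>
<a href="http://www.example.com">real link</a>
"""
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # only the <a href> URL is discovered
```

A common workaround is to keep a real `<a href>` inside the div for crawlers and use the onclick purely as a usability enhancement.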
Technical SEO | | Morris770 -
WP Blog Errors
My WP blog is appending my email address during the crawl, and I am getting 200+ errors similar to the following: http://www.cisaz.com/blog/2010/10-reasons-why-microsofts-internet-explorer-dominance-is-ending/tony@cisaz.net. "tony@cisaz.net" is added to every post. Any ideas how I fix it? I am using the Yoast plugin. Thanks guys!
Technical SEO | | smstv0 -
404 Errors in Google Webmaster Tools
Hello, Google Webmaster Tools is returning our URLs as 404 errors: http://www.celebritynetworth.com/watch/D5GrrPEN9Oc/tom-mccarthy-floating/. When we enter the URL into the browser, it loads the page just fine. Is there a way to determine why Google Webmaster Tools is returning a 404 error when the link loads perfectly fine in a browser? Thanks, Alex
Technical SEO | | Anti-Alex0 -
Google causing Magento Errors
I have an online shop run using Magento. I recently upgraded to version 1.4 and installed an extension called Lightspeed, a caching module which makes tremendous improvements to Magento's performance. Unfortunately, a configuration problem meant that I had to disable the module, because it was generating session-related errors if you entered the site from any page other than the home page. The site is now working as expected. I have Magento's error notification set to email, and I've not received emails for errors generated by visitors. However, over a 72-hour period I received a deluge of error emails which were being caused by Googlebot. It was generating an error in a file called lightspeed.php. Here is an example:

URL: http://www.jacksgardenstore.com/tahiti-vulcano-hammock
IP Address: 66.249.66.186
Time: 2011-06-11 17:02:26 GMT
Error: Cannot send headers; headers already sent in /home/jack/jacksgardenstore.com/user/jack_1.4/htdocs/lightspeed.php, line 444

Several things of note:

- I deleted lightspeed.php from the server before any of these error messages began to arrive.
- lightspeed.php was never exposed in the URL at any time. It was referred to in a mod_rewrite rule in .htaccess, which I also commented out.
- If you clicked on the URL in the error message, it loaded in the browser as expected, with no error messages.

It appears that Google has cached a version of the page which briefly existed whilst Lightspeed was enabled. But I thought that Google cached generated HTML; since when does it cache a server-side PHP file? I've just used the Fetch as Googlebot facility on Webmaster Tools for the URL in the above error message, and it returns the page as expected. No errors. I've had no errors at all in the last 48 hours, so I'm hoping it's just sorted itself out. However, I'm concerned about any Google-related implications. Any insights would be greatly appreciated. Thanks, Ben
Technical SEO | | atticus70