Strange 404s in Screaming Frog
-
I just ran a website (Drupal) through Screaming Frog, and the only 404s I found were for URLs made up of existing page URLs plus the company phone number, e.g. www.company.com/[their phone number] and www.company.com/services[their phone number]. Any ideas what might be causing this problem?
-
Hi Luke,
As the replies above suggest, it sounds like an a href containing a phone number.
If you check the 'Inlinks' tab (in the lower window), you'll be able to see the source of these errors (the pages the links are located on). You can then view the source code, find the exact link, and see what the issue might be.
Hope that helps!
Feel free to pop any further questions through directly to our support btw (http://www.screamingfrog.co.uk/seo-spider/support/), I only spotted this via a Google alert.
(We try and reply super quick & will always look into any problems!)
Cheers.
Dan
-
This is typically caused by a link on the page that is not formed correctly.
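To see why the phone number ends up appended to existing URLs: if a link's href is a bare phone number with no tel: scheme, crawlers treat it as a relative URL and resolve it against the page it appears on. A quick illustration using Python's urljoin (the URLs here are made-up examples, not from the site in question):

```python
from urllib.parse import urljoin

# A bare phone number in an href is treated as a relative path and
# resolved against the page it appears on, producing phantom URLs.
page = "http://www.company.com/services/"
bad_href = "01234567890"  # should have been "tel:01234567890"

print(urljoin(page, bad_href))  # http://www.company.com/services/01234567890

# With the tel: scheme, the link is not a crawlable page URL at all.
print(urljoin(page, "tel:01234567890"))  # tel:01234567890
```

Fixing the href to include the tel: prefix makes the 404s disappear, since the crawler no longer sees a relative page link.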
-
Related Questions
-
How to stop internal dynamically created links that generate 404s
Hi, I have a question on whether to block these in robots.txt or use nofollow tags. The links are part of a dynamic tab control that generates the tabs based on CMS content; these are internal links on the page which show content within a tab control. Would it stop these errors appearing in the Google report if we added rel="nofollow" to the <a> tag (for example, the "VIEW FROM SEATS" link)? Is it better to use nofollow tags or to block them in robots.txt? Thanks, Cee
Intermediate & Advanced SEO | mara.lature1
-
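For reference, a robots.txt block for dynamically generated tab URLs would look something like the sketch below. The path pattern is hypothetical, since the actual tab URLs aren't shown in the question; note also that robots.txt blocks crawling entirely, whereas rel="nofollow" only affects how individual links are treated.

```
User-agent: *
Disallow: /*?tab=
```

Google supports the * wildcard in Disallow rules, so a pattern like this would keep Googlebot from crawling any URL containing ?tab=.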
Strange: page no longer present in SERPS and I'm not sure why
I indexed a new page last week and it ranked 1st. The page is still live, still registering sessions in Analytics, and registering activity in Search Console. Why is it no longer present for the keyword it ranked first for on Friday?
Intermediate & Advanced SEO | Jacksons_Fencing
-
Screaming Frog returning both HTTP and HTTPS results...
Hi, About 10 months ago I switched from HTTP to HTTPS. I then switched back (long story). I noticed that Screaming Frog is picking up both the HTTP and HTTPS versions of the site. Maybe this doesn't matter, but I'd like to know why SF is doing that. The URL is: www.aerlawgroup.com. Any feedback, including how to remove the HTTPS version, is greatly appreciated. Thanks.
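If the site is meant to live on HTTP only (since the switch was reversed), the usual way to "remove" the HTTPS version is a server-level 301 from HTTPS back to HTTP, so crawlers consolidate on one protocol. A minimal Apache mod_rewrite sketch, assuming an .htaccess setup (the exact conditions depend on your server and any proxy in front of it):

```
# Redirect all HTTPS requests back to the HTTP version with a 301.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.aerlawgroup.com/$1 [R=301,L]
```

With a redirect like this in place, Screaming Frog will still discover HTTPS URLs if they're linked internally, but they'll report as 301s rather than a second indexable copy of the site.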
Intermediate & Advanced SEO | mrodriguez1440
-
URL Parameter Setting Recommendation - Webmaster Tools, Breadcrumbs & 404s
Hi All, We use a parameter called "breadCrumb" to drive the breadcrumbs on our ecommerce product pages that are categorized in multiple places. For example, our "Blue Widget" product may have the following URLs:

http://www.oursite.com/item3332/blue-widget
http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree1
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree2

We use a canonical tag pointing back to the base product URL; the parameter only changes the breadcrumbs. Which of the following settings, if any, would you recommend for such a parameter in GWT:

Does this parameter change page content seen by the user? Options: Yes/No
How does this parameter affect page content? Options: Narrows/Specifies/Other

Currently, Google decided to automatically assign the parameter as "Yes/Other/Let Googlebot Decide" without notifying us. We noticed a drop in rankings around the suspected time of the assignment. Lastly, we have a consistent flow of discontinued products that we 404. As a result of the breadcrumb parameter, our 404s increase significantly (one for each path). Would 800 404 crawl errors out of 18k products cause a penalty on a young site? We got an "Increase in '404' pages" email from GWT shortly after our rankings seemed to drop. Thank you for any advice or suggestions! Doug
Intermediate & Advanced SEO | Doug_G
-
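As a sanity check when auditing a setup like this, you can normalize URLs by stripping the parameter and confirm that all variants collapse to the canonical URL. A small sketch (the parameter name comes from the question; the helper function itself is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_param(url, param="breadCrumb"):
    """Remove one query parameter, mirroring what the canonical tag declares."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

variants = [
    "http://www.oursite.com/item3332/blue-widget",
    "http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree",
    "http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree1",
]
# All variants should normalize to the single canonical URL.
assert {strip_param(u) for u in variants} == {"http://www.oursite.com/item3332/blue-widget"}
```

If the normalized set ever contains more than one URL, the parameter is doing more than driving breadcrumbs and the GWT setting would need a different answer.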
Strange internal links and trying to improve PR? Please advise
Hi All, I've been looking at the internal links on my eCommerce site to try and improve PR and get it as efficient as possible so link juice isn't getting wasted, and I've come across some odd ones I would like advice on.

My website currently has between 125 and 146 links on every page (sitemap approx 3,500 pages). From what I've read, the ideal number of links is under 100, but can someone confirm if this is still the case? Is it a case of less is more in terms of improving a page's PR, so link juice isn't getting diluted to unnecessary pages?

One of my links is a bad URL (my domain + phone number, for some reason) which currently goes to a 404 page. Is this okay, or do we need to track down the link and remove it? I don't want link juice getting wasted, as it's on every page.

Another of my links is my domain name followed by /#, and another has some characters after the #; both go to the home page. For example, www.domain.co.uk/# and www.domain.co.uk#abcde both go to the homepage. Is this okay, or am I potentially getting duplicate content, since these URLs all resolve to my home page?

I have a link on every page which opens up Outlook (email) on the contact us link. Should this really be changed to a button that opens a contact us form instead?

I currently have 9 links at the bottom of every page, i.e. about us, delivery, hire terms, contact us, trade accounts, privacy, sitemap. When I check, these pages seem to be my strongest pages in terms of PR. Is that because they are on every page? Should I look to reduce these links, since they are accessible from the navigation menu apart from privacy and sitemap?

Any advice would be greatly appreciated. Thanks, Pete
Intermediate & Advanced SEO | PeteC12
-
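On the /# links specifically: the fragment after # is never sent to the server, so domain.co.uk/# and domain.co.uk/#abcde are the same resource and shouldn't create duplicate content. Python's urldefrag shows the split (domain taken from the question above):

```python
from urllib.parse import urldefrag

# Everything after '#' is handled client-side; the server sees one URL.
url1, frag1 = urldefrag("http://www.domain.co.uk/#")
url2, frag2 = urldefrag("http://www.domain.co.uk/#abcde")

assert url1 == url2 == "http://www.domain.co.uk/"
assert frag2 == "abcde"
```

Crawlers may still report these as distinct discovered links, but both map to the homepage request, so a canonical tag on the homepage covers them.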
Strange 404s in GWT - "Linked From" pages that never existed
I’m having an issue with Google Webmaster Tools saying there are 404 errors on my site. When I look into my “Not Found” errors I see URLs like this one: Real-Estate-1/Rentals-Wanted-228/Myrtle-Beach-202/subcatsubc/ When I click on that and go to the “Linked From” tab, GWT says the page is being linked from http://www.myrtlebeach.com/Real-Estate-1/Rentals-Wanted-228/Myrtle-Beach-202/subcatsubc/ The problem here is that page has never existed on myrtlebeach.com, making it impossible for anything to be “linked from” that page. Many more strange URLs like this one are also showing as 404 errors. All of these contain “subcatsubc” somewhere in the URL. My Question: If that page has never existed on myrtlebeach.com, how is it possible to be linking to itself and causing a 404?
Intermediate & Advanced SEO | Fuel
-
Recent ranking drop followed by strange behavior in SERPS
Recently, on the evening of August 5th, almost all of the keywords our pages ranked highly for dropped by anywhere from 5 to 8 pages. The only activity during this time was an article that had been picked up by a major news outlet and then apparently copied onto other sources with links to our domain and article. More puzzling: rather than simply having the same page show up lower for a keyword, in a number of instances a different page is now shown for the result, often with little or no relevance to the keyword. In some cases, for a single keyword phrase, we've seen as many as 10 different pages rotated throughout the day when performing a search. Prior to our rankings falling, we never saw this behavior.
Intermediate & Advanced SEO | BrianQuinn
-
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    // $city_page_removed stands in for "this city page should be removed"
    if ($city_page_removed) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have little to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
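One note on the code above: a 404 status combined with a Location header is contradictory, since Location is only meaningful with 3xx (redirect) responses. If the decision is to consolidate rather than 404, a plain 301 is the standard way to pass visitors (and any link equity) to the state page; in PHP that would be header("Location: ...", true, 301). A minimal language-neutral sketch of the mapping logic, with made-up paths:

```python
# Map removed city-level paths to their state-level replacements.
# These paths are hypothetical examples, not from the original site.
REMOVED_CITY_PAGES = {
    "/city/springfield": "/state/illinois",
    "/city/portland": "/state/oregon",
}

def respond(path):
    """Return (status, location): 301 to the state page if the city page was removed."""
    if path in REMOVED_CITY_PAGES:
        return 301, REMOVED_CITY_PAGES[path]
    return 200, None

assert respond("/city/springfield") == (301, "/state/illinois")
assert respond("/state/illinois") == (200, None)
```

Pages that should stay gone with no good replacement are better served by a clean 404 (or 410) with no Location header at all.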