Unsolved Google Search Console Still Reporting Errors After Fixes
-
Hello,
I'm working on a website that had become bloated with content. We deleted many pages and set up redirects to newer ones, and we also resolved a large number of 400 errors on the site.
I also removed several old sitemaps that listed content deleted years ago, which Google was still crawling.
According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails.
What could be going on here? How can we resolve these errors in GSC?
-
Here are some potential explanations and steps you can take to resolve the errors in GSC:
Caching: Sometimes, GSC may still be using cached data and not reflecting the recent changes you made to your website. To ensure you're seeing the most up-to-date information, try clearing your browser cache or using an incognito window to access GSC.
Delayed Processing: It's possible that Google's systems have not yet processed the changes you made to your website. Although Google typically crawls and indexes websites regularly, it can take some time for the changes to be fully reflected in GSC. Patience is key here, and you may need to wait for Google to catch up.
Incorrect Implementation of Redirects: Double-check that the redirects you implemented are correctly set up. Make sure they are functioning as intended and redirecting users and search engines to the appropriate pages. You can use tools like Redirect Checker to verify the redirects.
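If you'd rather spot-check the redirects in bulk yourself, a short script along these lines can confirm that each old URL ends at the intended destination with a 200 (this is just a sketch; the URLs are placeholders for your own old and new pages):

```python
# Sketch: verify that each old URL redirects to the intended new page.
# The URL pairs below are placeholders - swap in your own redirects.
import requests

REDIRECTS = {
    "https://www.example.com/old-page/": "https://www.example.com/new-page/",
    "https://www.example.com/deleted-guide/": "https://www.example.com/guides/current/",
}

for old_url, expected in REDIRECTS.items():
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]  # e.g. [301] or [301, 302] for a chain
    ok = resp.url.rstrip("/") == expected.rstrip("/") and resp.status_code == 200
    print(f"{'OK  ' if ok else 'FAIL'} {old_url} -> {resp.url} (hops: {hops}, final: {resp.status_code})")
```

Watch for chains of more than one hop or anything ending in a 302; permanent moves should ideally resolve in a single 301 to a page that returns 200.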
Check Robots.txt: Ensure that your website's robots.txt file is not blocking Googlebot from accessing the necessary URLs. Verify that the redirected and fixed pages are not disallowed in the robots.txt file.
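To rule out robots.txt quickly, Python's built-in parser can check the fixed URLs against the Googlebot user agent (a minimal sketch, assuming example.com stands in for your domain):

```python
# Sketch: confirm robots.txt does not block Googlebot from the fixed URLs.
# The domain and URL list are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

urls_to_check = [
    "https://www.example.com/new-page/",
    "https://www.example.com/category/landing-page/",
]

for url in urls_to_check:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict}: {url}")
```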
Verify Correct Domain Property: Ensure that you have selected the correct domain property in GSC that corresponds to the website where you made the changes. It's possible that you might be validating the wrong property, leading to repeated failures.
URL Inspection Tool: Use the "URL Inspection" tool in GSC to manually check specific URLs and see how Google is currently processing them. This tool provides information about indexing status, crawling issues, and any errors encountered.
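If you have a lot of URLs to check, the Search Console API also exposes a URL Inspection endpoint. A rough sketch using google-api-python-client is below; it assumes you already have OAuth credentials for the property, and the field names should be verified against Google's API documentation:

```python
# Rough sketch: inspect a URL programmatically via the Search Console URL Inspection API.
# Assumes google-api-python-client is installed and `creds` holds valid OAuth credentials
# for the property; site_url and page_url are placeholders.
from googleapiclient.discovery import build

def inspect_url(creds, site_url, page_url):
    service = build("searchconsole", "v1", credentials=creds)
    body = {"inspectionUrl": page_url, "siteUrl": site_url}
    result = service.urlInspection().index().inspect(body=body).execute()
    status = result["inspectionResult"]["indexStatusResult"]
    print(page_url, status.get("coverageState"), "last crawl:", status.get("lastCrawlTime"))
```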
Re-validate the Fixes: If you have already submitted the fixes for validation in GSC and they failed, try submitting them again. Sometimes, the validation process can encounter temporary glitches or errors.
If you have taken the appropriate steps and the validation failures persist in GSC, it may be worth reaching out to Google's support team for further assistance. They can help troubleshoot the specific issues you are facing and provide guidance on resolving the errors.
-
We're facing the same error with redirects even after applying the fix on our website https://ecomfist.com/.
-
@tif-swedensky It usually takes between a week and three months for the right results to show, so don't worry about that. If you've fixed the issues, you're good to go.
-
Hi! Google Search Console has this issue; I would recommend not paying much attention to it. If you know that everything is correct on the website, then you don't need to worry just because of Search Console reports.
-
In this case, it's likely that Googlebot crawled your site before you fixed the errors and hasn't yet recrawled to detect the changes. To address this, it helps to invest in premium SEO tools such as Ahrefs or Screaming Frog that can audit your website both before and after you make changes. Once you have them in place, take screenshots of the findings before and after fixing the issues and send those to your client so they can see the improvements that have been made.
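If you want hard numbers alongside the screenshots, even a small script that snapshots status codes before and after the fixes can make the improvement easy to demonstrate (a sketch; the file name, label, and URLs are placeholders):

```python
# Sketch: record HTTP status codes for a URL list so "before fix" and "after fix"
# runs can be compared side by side. URLs and the label are placeholders.
import csv
import datetime
import requests

def snapshot(urls, label):
    out_file = f"status-snapshot-{label}-{datetime.date.today()}.csv"
    with open(out_file, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["url", "status", "final_url"])
        for url in urls:
            resp = requests.get(url, allow_redirects=True, timeout=10)
            writer.writerow([url, resp.status_code, resp.url])
    return out_file

urls = ["https://www.example.com/old-page/", "https://www.example.com/new-page/"]
print(snapshot(urls, "after-fix"))
```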
To give you an example, I recently encountered a similar issue while working with a medical billing company named HMS USA LLC. After running some SEO audits and making various fixes, the GSC errors had been cleared. However, it took a few attempts to get it right as the changes weren't detected on the first recrawl.
Hopefully, this information is useful and helps you understand why your GSC issues may still be showing up after being fixed. Good luck!
-
Hi,
We have had a similar problem before. We are an e-commerce company with the brand name VANCARO. As you know, user experience is very important for an e-commerce company, so we take the problems reported by GSC very seriously. But GSC updates can sometimes lag behind, so you may need to observe for a little longer. I can also share another tool: https://pagespeed.web.dev/. Hope it can help you.
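If you want to pull the same data programmatically, the PageSpeed Insights API behind that tool can be queried directly. A minimal sketch (the page URL is a placeholder, and an API key is recommended for anything beyond light use):

```python
# Sketch: query the PageSpeed Insights API (the engine behind pagespeed.web.dev)
# for a quick mobile performance score. The page URL is a placeholder.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```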
Related Questions
-
Strange - Search Console page indexing "../Detected" as 404
Anyone seen this lately? All of a sudden, Google Search Console is insisting in Page indexing that there is a 404 for a page that has never existed on our client's site: https://........com.au/Detected. We've noticed this across a number of sites, always in this exact form with a capitalised "/Detected". To me it looks like something spammy is being submitted to the SERPs (somehow) and Google is trying to index it and then getting a 404. Naturally Moz isn't picking it up, because the page simply never existed; it's only happening in Search Console. It comes and goes in the 404 alerts in Console and is really annoying. I reckon it started happening in late 2022.
Reporting & Analytics | | DanielDL0 -
Google Not Indexing Pages (Wordpress)
Hello, recently I started noticing that Google is not indexing our new pages or new blog posts. We are simply getting a "Discovered - Currently Not Indexed" message on all new pages. When I click "Request Indexing" it takes a few days, but eventually the page does get indexed and appears on Google. This is very strange, as our website has been around since the late 90s and the new content is neither duplicate nor low quality. We started noticing this happening around February. We also do not have many pages - maybe 500 at most. I have looked at all the obvious answers (allowing for indexing, etc.), but just can't seem to pinpoint a reason why. Has anyone had this happen recently? It is getting very annoying having to manually go in and request indexing for every page, and it makes me think there may be some underlying issues with the website that should be fixed.
Technical SEO | Hasanovic1
-
Unsolved Have we been penalised?
Hey Community, we need help! Have we been penalised, or is there some technical SEO issue that is stopping our service pages from being properly read? Website: www.digitalnext.com.au
In July 2021, we suffered a huge drop in coverage for both short and long-tail keywords. We thought this could have been because of link spam, Core Web Vitals, or a core update around that time period.
SEMRush: https://gyazo.com/d85bd2541abd7c5ed2e33edecc62854c
GSC: https://gyazo.com/c1d689aff3506d5d4194848e625af6ec
There is no manual action within GSC, and we have historically ranked on page 1 for super-competitive keywords. After waiting some time thinking it was an error, we then took the following actions: launched a new website, rewrote all page content (except blog posts), ensured each page passes Core Web Vitals, submitted a backlink detox, removed a website that was spoofing our old one, and introduced a strong pillar-and-cluster internal link structure.
After 3 months of the new website, none of our core terms have come back and we are struggling for visibility. We still rank for some super long-tail keywords, but this is the lowest visibility we have had in over 5 years. Every time we launch a blog post it does rank for competitive keywords, yet the old keywords are still completely missing. It almost feels like any URLs that used to rank for core terms are being penalised. So I am wondering whether this is a penalisation (and if so, from which algorithm), or whether there is something wrong with the structure of our service pages that stops them from ranking. Look forward to hearing from you, Steven
Technical SEO | StevenLord
-
Webpages & Images Index Graph Gone Down Badly in Google Search Console Why?
Hello All, what is going on with the Sitemap Index Status in Google Search Console? Webpages submitted: 35,000, with about 21,000 showing as indexed, whereas previously approx 34,500 were indexed. Images submitted: 85,000, with about 11,000 showing as indexed, whereas previously approx 80,000 were indexed. Yet when I search in Google with site:abcd.com, it shows approx 27,000 indexed webpages. No message from Google about a penalty or warning, etc. Please help.
Technical SEO | wright3350
-
Updating Old Content - Should I update In Search Console?
Hey Mozzers, if I'm updating old content on a site (for example, adding some copy and some new links to the page), is it important to get Google to recrawl it using the feature in Webmaster Tools? If you didn't do this, could you be waiting a long time for Google to recrawl the URL? Cheers!
Technical SEO | wearehappymedia0
-
My site was hacked and spammy URLs were injected that pointed outward. The issue was fixed, but GWT is still reporting more of these links.
Excuse me for posting this here; I wasn't having much luck going through GWT support. We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of which pointed outward to spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in. As you can see, there are now over 20,000 of these links. Note that our server support team does not see these links anywhere. I understand that Google doesn't generally view this as a problem, but is that true in my circumstance? I cannot imagine that 20,000 new, senseless 404s can be healthy for my website. If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
Technical SEO | jampaper0
-
First Crawl Report
Just joined SEOMoz today and am slightly overwhelmed, but excited about learning loads from it. I've just received my Crawl Report and there is a 404 (UserPreemptionError) for http://www.iainmoran.com/comments/feed/. This is a WordPress site and I have no idea of the best course of action to take. I've done some searching on Google, and a couple of sites suggest removing that URL from within the robots.txt file. I'm using the Yoast plugin, which apparently creates a robots.txt file, but I can't see any way to edit it. Is there another solution for resolving the 404 error? Many thanks, Iain.
Technical SEO | iainmoran
-
RSS Feed Errors in Google
We recently (2 months ago) launched RSS feeds for the category pages on our site. Last week we started seeing error pages pop up in Webmaster Tools' Crawl Errors report for feeds of old pages that have been deleted from the site, removed from the sitemap, and absent from Google's index since long before we launched the RSS feeds. Example: www.mysite.com/super-old-page/feed/ I checked, and both the URL for the feed and the URL for the actual page return 404 statuses. www.mysite.com/super-old-page/ is also showing up in our Crawl Errors. It's been deleted for months, but Webmaster Tools is very slow to remove the page from its Crawl Error report. Where is Google finding these feeds that never existed?
Technical SEO | Hakkasan0