Unsolved Google Search Console Still Reporting Errors After Fixes
-
Hello,
I'm working on a website that had become bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved a large number of 400 errors on the site.
I also removed several ancient sitemaps that listed content deleted years ago that Google was crawling.
According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails.
What could be going on here? How can we resolve these errors in GSC?
-
Here are some potential explanations and steps you can take to resolve the errors in GSC:
Caching: Sometimes, GSC may still be using cached data and not reflecting the recent changes you made to your website. To ensure you're seeing the most up-to-date information, try clearing your browser cache or using an incognito window to access GSC.
Delayed Processing: It's possible that Google's systems have not yet processed the changes you made to your website. Although Google typically crawls and indexes websites regularly, it can take some time for the changes to be fully reflected in GSC. Patience is key here, and you may need to wait for Google to catch up.
Incorrect Implementation of Redirects: Double-check that the redirects you implemented are set up correctly and are sending both users and search engines to the appropriate pages. Watch out for redirect chains and loops, which can prevent Google from reaching the final destination. You can use a tool like Redirect Checker to verify the redirects.
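If you maintain your redirects as a mapping, a small script can catch chains and loops before you resubmit for validation. This is a minimal sketch in Python; the paths in the mapping are made-up placeholders, not your actual URLs:

```python
# Sketch: catch redirect chains and loops in a redirect map before deploying it.
# The mapping below is illustrative, not a real site's redirects.

def resolve(redirects, url, max_hops=10):
    """Follow a redirect map and return (final_url, number_of_hops)."""
    seen = set()
    hops = 0
    while url in redirects:
        if url in seen or hops >= max_hops:
            raise ValueError(f"redirect loop or over-long chain at {url!r}")
        seen.add(url)
        url = redirects[url]
        hops += 1
    return url, hops

redirects = {
    "/old-page": "/interim-page",   # 2-hop chain: better to point straight at /new-page
    "/interim-page": "/new-page",
    "/gone": "/gone",               # self-redirect: an infinite loop for crawlers
}

print(resolve(redirects, "/old-page"))  # ('/new-page', 2)
```

Anything that resolves in more than one hop is worth collapsing into a single 301 straight to the final URL.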
Check Robots.txt: Ensure that your website's robots.txt file is not blocking Googlebot from accessing the necessary URLs. Verify that the redirected and fixed pages are not disallowed in the robots.txt file.
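As a quick offline sanity check, Python's standard-library robots.txt parser can tell you whether a given rule set would block Googlebot from a URL. The rules and URLs below are placeholders, not your actual file:

```python
# Sketch: check whether a robots.txt rule set blocks specific URLs.
# The rules and URLs here are illustrative placeholders.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /old-section/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The fixed/redirected page should be fetchable...
print(rp.can_fetch("Googlebot", "https://example.com/new-page"))          # True
# ...while anything under the disallowed path is blocked.
print(rp.can_fetch("Googlebot", "https://example.com/old-section/page"))  # False
```

Run this against a copy of your live robots.txt to confirm none of the fixed URLs fall under a Disallow rule.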
Verify Correct Domain Property: Ensure that you have selected the correct domain property in GSC that corresponds to the website where you made the changes. It's possible that you might be validating the wrong property, leading to repeated failures.
Inspect URL Tool: Utilize the "Inspect URL" tool in GSC to manually check specific URLs and see how Google is currently processing them. This tool provides information about indexing status, crawling issues, and any potential errors encountered.
Re-validate the Fixes: If you have already submitted the fixes for validation in GSC and they failed, try submitting them again. Sometimes, the validation process can encounter temporary glitches or errors.
If you have taken the appropriate steps and the validation failures persist in GSC, it may be worth reaching out to Google's support team for further assistance. They can help troubleshoot the specific issues you are facing and provide guidance on resolving the errors.
-
We're facing the same error with redirects even after applying the fix on our website https://ecomfist.com/.
-
@tif-swedensky It usually takes between a week and three months to show the right results, so don't worry about that. If you've fixed the issues, you're good to go.
-
Hi! Google Search Console has this issue, and I would recommend not paying much attention to it. If you know that everything is correct on the website, then you don't need to worry just because of Search Console issues.
-
In this case, it's likely that Google's bots crawled your site before you fixed the errors and haven't yet recrawled to detect the changes. To address this, it helps to use SEO tools such as Ahrefs or Screaming Frog that can audit your website both before and after you make changes. Once they're in place, take screenshots of the findings before and after fixing the issues and send them to your client so they can see the improvements that have been made.
To give you an example, I recently encountered a similar issue while working with a medical billing company named HMS USA LLC. After running some SEO audits and making various fixes, the GSC errors were cleared. However, it took a few attempts to get it right, as the changes weren't detected on the first recrawl.
Hopefully, this information is useful and helps you understand why your GSC issues may still be showing up after being fixed. Good luck!
-
Hi,
We have had a similar problem before. We are an e-commerce company with the brand name VANCARO. As you know, user experience is very important for an e-commerce company, so we take the problems reported by GSC very seriously. But sometimes GSC's updates can be delayed, and you may need to wait and observe a little longer. I can also share another tool: https://pagespeed.web.dev/. Hope it can help you.