Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Google Search Console Showing 404 errors for product pages not in sitemap?
-
We have some products whose URLs have changed over the past several months. Google Search Console is showing the old URLs as 404 errors even though they are not in the sitemap (the sitemap lists the correct new URLs).
Is this expected? Will these errors eventually go away/stop being monitored by Google?
-
@woshea Implement 301 redirects from the old URLs to the new ones. This tells search engines that the old page has permanently moved to a new location. It also ensures that visitors who click on old links are redirected to the correct content.
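If you want to spot-check that those redirects are actually in place, a short script can do it. This is only a rough sketch using Python's requests library, and the URLs in it are hypothetical placeholders for your own old and new product URLs:

```python
import requests

# Hypothetical example mapping - replace with your own old/new product URLs.
REDIRECTS = {
    "https://www.example.com/products/old-widget": "https://www.example.com/products/new-widget",
}

for old_url, expected_new_url in REDIRECTS.items():
    # Don't follow the redirect; inspect the first response directly.
    # Some servers don't answer HEAD requests; switch to requests.get if needed.
    response = requests.head(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    # Assumes the Location header contains an absolute URL.
    if response.status_code == 301 and location.rstrip("/") == expected_new_url.rstrip("/"):
        print(f"OK:  {old_url} -> {location}")
    else:
        print(f"FIX: {old_url} returned {response.status_code} (Location: {location or 'none'})")
```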
-
Yes, this is expected. It is not uncommon for Google to report 404 errors for products whose URLs have changed, even when the sitemap only lists the correct new URLs. Google crawls URLs it has previously indexed or discovered through links, not just the ones in your sitemap, so the old URLs keep getting rechecked for a while.
Typically, these 404 errors drop out of the report once Google has recrawled the old URLs and fully indexed the new ones. How long that takes varies with the frequency of Googlebot's crawls and the size of your website. I ran into the same issue on one of my own sites and resolved it using the techniques below.
-
Ensure that your sitemap is up-to-date and includes all the correct URLs for your products.
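If you want to verify that, you can pull the sitemap and check which URLs it actually contains. A minimal sketch in Python follows; the sitemap location and product URLs are hypothetical, and it assumes a plain urlset sitemap rather than a sitemap index:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"   # hypothetical sitemap location
OLD_URLS = {"https://www.example.com/products/old-widget"}
NEW_URLS = {"https://www.example.com/products/new-widget"}

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)

# Collect every <loc> entry listed in the sitemap.
sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

print("Old URLs still in sitemap:", sorted(OLD_URLS & sitemap_urls) or "none")
print("New URLs missing from sitemap:", sorted(NEW_URLS - sitemap_urls) or "none")
```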
-
Check for any internal links on your website that may still be pointing to the old URL and update them to the new URL.
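On a smaller site you can automate that check as well. The sketch below fetches a few pages and flags any anchor that still points at an old path; the page list and old path fragments are hypothetical and would need to be replaced with your own:

```python
import requests
from html.parser import HTMLParser

OLD_PATH_FRAGMENTS = ["/products/old-widget"]  # hypothetical old URL paths to look for
PAGES_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
]

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

for page in PAGES_TO_CHECK:
    parser = LinkCollector()
    parser.feed(requests.get(page, timeout=10).text)
    stale = [href for href in parser.links
             if any(fragment in href for fragment in OLD_PATH_FRAGMENTS)]
    if stale:
        print(f"{page} still links to old URLs: {stale}")
```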
-
Use 301 redirects from the old URLs to the new URLs. For example, set up a 301 redirect from each product's old URL to its new URL. This tells Google and other search engines that the content has permanently moved to a new location.
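How you set the redirects up depends on your platform; most e-commerce systems have a redirect manager, or you can handle it at the web-server level. Purely as an illustration of the mapping, here is a small Python/Flask sketch with hypothetical old and new product paths:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical mapping of old product paths to their new locations.
REDIRECT_MAP = {
    "/products/old-widget": "/products/new-widget",
    "/products/old-gadget": "/gadgets/new-gadget",
}

@app.before_request
def redirect_legacy_urls():
    # If the requested path is a known old URL, send a permanent (301) redirect.
    new_path = REDIRECT_MAP.get(request.path)
    if new_path:
        return redirect(new_path, code=301)
    # Returning None lets the request continue to the normal routes.
    return None
```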
-
Related Questions
-
Best practices for retiring 100s of blog posts?
Hi. I wanted to get best practices for retiring an enterprise blog with hundreds of old posts with subject matter that won't be repurposed. What would be the best course of action to retire and maintain the value of any SEO authority from those old blog pages? Is it enough to move those old posts into an archive subdirectory and Google would deprioritize those posts over time? Or would a mass redirect of old blog posts to the new blog's home page be allowed (even though the old blog post content isn't being specifically replaced)? Or would Google basically say that if there aren't 1:1 replacement URLs, that would be seen as soft-404s and treated like a 404?
White Hat / Black Hat SEO | David_Fisher
-
Search Console Missing field 'mainEntity'
Hello,
I have a problem: I added an FAQ with schema markup to my site (https://internships-usa.eu/faq/), but the following issue is appearing in Search Console:
Missing field 'mainEntity' ["WebPage","FAQPage"],"@id":"https://internships-usa.eu/faq/#webpage","url":"https://internships-usa.eu/faq/","name":"Help Center - Internships USA","isPartOf":{"@id":"https://internships-usa.eu/#website"},"datePublished":"2022-05-31T14:43:15+00:00","dateModified":"2022-06-01T08:07:13+00:00","breadcrumb":{"@id":"https://internships-usa.eu/faq/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https://internships-usa.eu/faq/"]}]}
What do I have to do to solve this?
SEO Tactics | spaininternship
-
Does anyone know whether the linking of hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls ('Pages with 404 errors'). Anyone got any experience please?
Does anyone know whether the linking of hashtags on Wix sites negatively or positively impacts SEO? It is coming up as an error in site crawls ('Pages with 404 errors'). Anyone got any experience please? For example, at the bottom of this blog post https://www.poppyandperle.com/post/face-painting-a-global-language the hashtags are linked, but they don't go to a page; they go to search results of all other blogs using that hashtag. Seems a bit of a strange approach to me.
Technical SEO | Mediaholix
-
Removing site subdomains from Google search
Hi everyone, I hope you are having a good week? My website has several subdomains that I had shut down some time back and pages on these subdomains are still appearing in the Google search result pages. I want all the URLs from these subdomains to stop appearing in the Google search result pages and I was hoping to see if anyone can help me with this. The subdomains are no longer under my control as I don't have web hosting for these sites (so these subdomain sites just show a default hosting server page). Because of this, I cannot verify these in search console and submit a url/site removal request to Google. In total, there are about 70 pages from these subdomains showing up in Google at the moment and I'm concerned in case these pages have any negative impacts on my SEO. Thanks for taking the time to read my post.
Technical SEO | QuantumWeb62
-
Abnormally high internal link reported in Google Search Console not matching Moz reports
If I'm looking at our internal link count and structure in Google Search Console, some pages are listed as having over a thousand internal links within our site. I've read that having too many internal links on a page devalues that page's PageRank, because the value is divided amongst the pages it links out to. Likewise, I've heard having too many internal links is just bad in general for SEO. Is that true? The problem I'm facing is determining how Google is "discovering" these internal links. If I look at one single page reported with, say, 1,350 links and check the code, it may only have 80 or 90 actual links. Moz confirms this as well. So why would Google Search Console report differently? Should I be concerned about this?
Technical SEO | Closetstogo
-
Yoast SEO: 404 error pages after setup
Hello all, Something strange happened with my blog site. I recently signed up to Moz tools. Initially everything was fine, but during my last crawl I got loads of 404 pages. A few days ago I was tweaking some settings in the SEO plugin according to this post: https://moz.com/blog/setup-wordpress-for-seo-success. What I noticed was that the 404 pages were coming from my blog posts, but for some reason the category was missing from those URLs. For example, this link is a 404: https://a-fotografy.co.uk/inchcolm-island-wedding-photography-bailie and the one with the category is https://a-fotografy.co.uk/wedding-pictures/inchcolm-island-wedding-photography-bailie/. So basically the category was missing for some reason. Please let me know how I can fix this instead of doing hundreds of redirects. Thank you,
Regards,
Armands
Technical SEO | A_Fotografy
-
Is Google suppressing a page from results - if so why?
UPDATE: It seems the issue was that pages were accessible via multiple URLs (i.e. with and without trailing slash, with and without .aspx extension). Once this issue was resolved, pages started ranking again. Our website used to rank well for a keyword (top 5), though this was over a year ago now. Since then the page no longer ranks at all, but sub pages of that page rank around 40th-60th. I searched for our site and the term on Google (i.e. 'Keyword site:MySite.com') and increased the number of results to 100; again, the page isn't in the results. However, when I just search for our site (site:MySite.com) the page is there, appearing higher up the results than the sub pages. I thought this may be down to keyword stuffing; there were around 20-30 instances of the keyword on the page, however roughly the same quantity of keywords was on each sub page as well. I've now removed some of the excess keywords from all sections, as it was getting in the way of usability as well, but I just wanted some thoughts on whether this is a likely cause or if there is something else I should be worried about.
Technical SEO | Datel1
-
Are 404 Errors a bad thing?
Good Morning... I am trying to clean up my e-commerce site and I created a lot of new categories for my parts... I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if for some reason one of them has been spidered by Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page... In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | Prime85