Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Google Search Console Showing 404 errors for product pages not in sitemap?
-
We have some products whose URLs have changed over the past several months. Google is showing 404 errors for the old URLs even though they are not in the sitemap (the sitemap lists the correct NEW URLs).
Is this expected? Will these errors eventually go away and stop being monitored by Google?
-
@woshea Implement 301 redirects from the old URLs to the new ones. This tells search engines that each old page has permanently moved to a new location, and it ensures that visitors who click on old links are redirected to the correct content.
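For example, on an Apache server the rules might look something like this minimal sketch; the paths here are placeholders for your real old and new product URLs, and nginx or your ecommerce platform will have an equivalent setting:

    # Redirect a single old product URL to its new location (placeholder paths)
    Redirect 301 /products/old-widget /products/new-widget

    # Or redirect a whole renamed section with a mod_rewrite pattern
    RewriteEngine On
    RewriteRule ^old-products/(.*)$ /products/$1 [R=301,L]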
-
Yes, it is not uncommon for Google to show 404 errors for products with URL changes, even if the correct new URLs are listed in the sitemap. Google's crawlers can take some time to recrawl the old URLs and update the index with the new ones.
Typically, these 404 errors should eventually go away and stop being reported once Google has fully indexed and recognized the new URLs. However, how long that takes varies with how frequently Googlebot crawls your site and how large the site is. I ran into the same issue on my own site, a flyer maker app, and resolved it using the techniques below.
-
Ensure that your sitemap is up-to-date and includes all the correct URLs for your products.
-
Check for any internal links on your website that may still be pointing to the old URL and update them to the new URL.
-
Use 301 redirects from the old URLs to the new URLs. For example, set up a 301 redirect from each product's old URL to its new URL. This tells Google and other search engines that the content has permanently moved to a new location. A quick way to spot-check the redirects once they are live is sketched below.
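Once the redirects are in place, a small script can confirm that each old URL actually returns a 301 pointing at the right new URL. This is just a minimal sketch in Python (assuming the requests library is installed); the URL pairs are placeholders for your real old and new URLs:

    # Verify that old product URLs 301-redirect to the new URLs.
    # Assumes `requests` is installed (pip install requests); URLs are placeholders.
    import requests

    url_pairs = [
        ("https://example.com/products/old-widget", "https://example.com/products/new-widget"),
    ]

    for old_url, new_url in url_pairs:
        # allow_redirects=False lets us inspect the redirect response itself
        response = requests.get(old_url, allow_redirects=False, timeout=10)
        status = response.status_code
        location = response.headers.get("Location", "")
        if status == 301 and location == new_url:
            print("OK:", old_url, "->", location)
        else:
            print("CHECK:", old_url, "returned", status, "Location:", location)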
-
Related Questions
-
Are there ways to avoid false-positive "soft 404s" by Google?
Sometimes I get alerts from Google Search Console that it has detected soft 404s on different websites, and since I take great care to never have true soft 404s, they are always false positives. Today I got one on a website that has pages promoting events. The page for one sold-out event says that "tickets are no longer available," which seems to have tripped Google into thinking the page is a soft 404. It's kind of incredible to me that in the current era, with things like ChatGPT, Google doesn't seem to understand natural language. But that has me thinking: are there strategies or best practices for how we write copy on a page so Google doesn't flag it as a soft 404? It seems like any wording that tells a user an item isn't available could trip Google into treating the page as a 404. In the case of my page, it's actually important to tell the public that an event has sold out, but also to use their interest in that event to promote other events, so I don't want the page deindexed or ranking poorly!
Technical SEO | Apr 27, 2023, 9:24 AM | IrvCo_Interactive
-
Sitemap error in Webmaster Tools - 409 error (conflict)
Hey guys, I'm getting this weird error when I submit my sitemap to Google. It says I'm getting a 409 (conflict) error in my post-sitemap.xml file (https://cleargear.com/post-sitemap.xml). But when I check the file, it looks totally fine. I am using Yoast SEO to generate the sitemap.xml file. Has anyone else experienced this? Is it a big deal? If so, does anyone know how to fix it? Thanks
Technical SEO | Dec 3, 2018, 10:56 AM | Extima-Christian
-
Google Search Console says 'sitemap is blocked by robots'?
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:
User-Agent: *
Disallow:
Sitemap: http://www.website.com/sitemap_index.xml
It's a WordPress site, with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix it?
Technical SEO | Jan 31, 2018, 5:25 PM | Extima-Christian
-
Unique page for each product variant? (Not eCommerce)
Hi Mozzers, Just looking for a little advice before I launch into a huge workload. We have landing pages for vehicle manufacturers. We then have anchor links on each page for every vehicle model that manufacturer has, with further info on the model further down the page. We're toying with the idea of launching a unique page for each model rather than having them all on the same landing page. This will take an age, but if it is worth it, we want to do it. Do you guys see a benefit to having unique pages for each model? Do you think it would attract more natural links? Would this help or hinder the manufacturer landing page in general? Should the manufacturer landing page be noindexed to avoid duplicate content issues? I can see a lot of work and risk, just looking for a few opinions. PM for more info. Thanks a lot people, Jamie
Technical SEO | Oct 15, 2015, 9:06 AM | SanjidaKazi
-
Duplicate Content Issues on Product Pages
Hi guys, Just keen to gauge your opinion on a quandary that has been bugging me for a while now. I work on an ecommerce website that sells around 20,000 products. A lot of the product SKUs are exactly the same in terms of how they work and what they offer the customer; often only one variable changes. For example, a product may be available in 200 different sizes and 2 colours (therefore 400 SKUs available to purchase). These SKUs have been uploaded to the website as individual entries so that the customer can purchase them, with the only differences between the listings likely to be key signifiers such as colour, size, price, part number, etc. Moz has flagged these pages up as duplicate content. Now, I have worked on websites long enough to know that duplicate content is never good from an SEO perspective, but I am struggling to work out an effective way to display such a large number of almost identical products without falling foul of the duplicate content issue. If you wouldn't mind sharing any ideas or approaches you have taken, that would be great!
Technical SEO | Jan 20, 2014, 3:17 PM | DHS_SH
-
How to Stop Google from Indexing Old Pages
We moved from a .php site to a Java site on April 10th. It's almost two months later and Google continues to crawl old pages that no longer exist (225,430 Not Found errors, to be exact). These pages no longer exist on the site, and there are no internal or external links pointing to them. Google has crawled the site since the go-live but continues to try to crawl these pages. What are my next steps?
Technical SEO | Jun 11, 2013, 4:56 PM | rhoadesjohn
-
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in specific: Google continuously crawls websites and stores each page it finds (let's call it "page directory") Google's "page directory" is a cache so it isn't the "live" version of the page Google has separate storage called "the index" which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is accessed in the "index" and returns all relevant pages in the "page directory" These returned pages are given ranks based on the algorithm The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a url in the "page directory", and the entries in the "index" contain these urls. Since Google's "page directory" is a cache, would the urls be the same as the live website (and would the keywords in the "index" point to these urls)? For example if webpage is found at wwww.website.com/page1, would the "page directory" store this page under that url in Google's cache? The reason I want to discuss this is to know the effects of changing a pages url by understanding how the search process works better.
Technical SEO | Jun 2, 2013, 12:00 PM | reidsteven750 -
Are 404 Errors a bad thing?
Good Morning... I am trying to clean up my e-commerce site, and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use or if for some reason one of them is still surfacing in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | Jan 20, 2013, 9:53 PM | Prime85